Milestone 4 Task authoring - Training interventions in Task Authorship for Requesters

From crowdresearch
Revision as of 15:38, 7 February 2016 by Angelarichmondfuller (Talk | contribs) (The Puzzle)


Outline

• What’s the phenomenon you’re interested in? A specific phenomenon! Not just “crowdsourcing”. More like what makes teams of workers effective.

Opportunities for work are being digitally transformed. Requesters want to post tasks and receive timely, high-quality results, while minimising the cost and time it takes for their work to be completed.

Requesters want their work completed to a high standard by workers whom they can trust to have the relevant skills.

1) Do requesters recognise that they need training to acquire skills relevant to certain tasks? If so, what kind?

2) Do requesters want to access training relevant to the completion of certain tasks on a crowdsourcing marketplace?

3) Do requesters access the training?

4) Does the quality of the tasks created by the requesters who accessed the training improve?

The Puzzle

• What observation can’t we account for yet?

The experimental design

Experiment 1:

• Who are you recruiting?

  Three groups: 
  1) Experienced microtask requesters 
  2) Novice requesters (requesters who have never posted a task on a microtask platform) 
  3) Novice requesters who undertake training in content-specific task creation

• What are the conditions?

• What are you measuring? What statistical procedure will you use?

  1) Quality of task creation: compare the experienced microtask requesters with the novice requesters, and then with the novice requesters who have undertaken training in content-specific task creation
  2) Does training in microtask creation achieve good results?
  3) If yes, why? If not, what type of training for microtask creation would newcomers want instead?
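The three-group quality comparison above could be analysed with a one-way ANOVA on per-task quality scores. The sketch below is illustrative only: the group names and 1–5 quality ratings are hypothetical placeholders, and a real study would use ratings collected from workers or expert judges.

```python
# Sketch of the planned comparison: one-way ANOVA F statistic over
# task-quality scores for the three requester groups. Pure Python,
# no external dependencies; the data below are made up for illustration.

def one_way_anova_f(groups):
    """Return the F statistic for a one-way ANOVA over lists of scores."""
    k = len(groups)                      # number of groups
    n = sum(len(g) for g in groups)      # total number of observations
    grand_mean = sum(sum(g) for g in groups) / n
    # Between-group sum of squares (each group mean vs. the grand mean)
    ssb = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares (each score vs. its own group mean)
    ssw = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ssb / (k - 1)) / (ssw / (n - k))

# Hypothetical 1-5 quality ratings for each condition
experienced = [4.2, 4.5, 4.1, 4.4]   # experienced requesters
novice      = [3.0, 2.8, 3.3, 3.1]   # novices, no training
trained     = [3.9, 4.0, 3.7, 4.1]   # novices after training

f_stat = one_way_anova_f([experienced, novice, trained])
```

A large F would indicate that mean task quality differs between at least two of the groups; follow-up pairwise comparisons would then show whether the trained novices close the gap with the experienced requesters.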

The result

• What (do you imagine) would happen?

The overall quality of the tasks will improve. Workers will spend less time trying to work out what the requester is asking for.

Future work

Offer templates

Collectives of workers identify good practice and create task templates. Requesters use or amend these templates. Submitted tasks go to a pool of workers associated with that task type for approval and release to workers.

Link to the meta-curriculum and provide triggers for both workers and requesters to acquire and update skills. Related reading:

• Training Workers for Improving Performance in Crowdsourcing Microtasks

• Toward a Learning Science for Complex Crowdsourcing Tasks

An Experiment to do at some point in the future:

The economy and workers’ lives have been digitised. Workers want to access opportunities for work as and when suits their individual circumstances. They want pathways for learning and skills, geared to their own needs and their own level of challenge. They also want to be able to share their achievements, and they want content and resources tailored to their individual needs and accessible for future reference. Workers do not want to be restricted to one skill area; crowdsourcing markets should allow them to branch out to other areas of work without barriers. Finally, workers want timely, relevant opportunities for updating their skills.

Experiment 2:

• Who are you recruiting? Experienced microtask workers vs. workers who have never completed work on a microtask platform (complete newcomers)

• What are the conditions?

• What are you measuring? What statistical procedure will you use?

  1) Quality of task completion
  2) What type of training for microtask completion is necessary
  3) What other types of

The result

• What (do you imagine) would happen?

Contributor: @arichmondfuller

Details on Milestone 4: