Winter Milestone 3 @Yoni.Dayan: Reputation and Calibration

From crowdresearch

Hello everyone :)

I will continue my exploration of the future of work, crowdsourcing, our capacity to learn, and the "human factor" (the feeling of relatedness, of belonging to a community, etc.).

Problem statement

How can crowdworkers get ready for a task, and how can requesters build trust in the capacity of crowdworkers to handle that task?

Calibrating system

- The requester would design "blank crowdwork". - Example from online learning: in MOOCs we do peer reviews, i.e. we push learners to review their peers if they want to continue. Usually we ask them (in NovoEd) to calibrate their peer reviewing by doing blank ("counting for nothing") reviews, say 3 or 4 of them. The purpose is for them to understand how it's done and to learn; then they are ready to do the 5 real peer reviews.

- The idea is to apply this system to crowdsourcing: fake micro-tasks that stand as a training funnel toward the real thing. - This training would act as a sort of micro-credentialization/trustalization (lol) of new crowdworkers, as in: "hey guys whom I don't know, prove to me you can do Y, and then you will do the real Z".

- These could also be entry-level crowdwork tasks instead of fake ones.
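The calibration funnel above could be sketched as a simple gate: a worker answers a few blank tasks with known answers, and real tasks unlock only once their score clears a threshold. This is a minimal illustration; the class name, threshold, and exact-match scoring are all assumptions, not any real platform's API.

```python
PASS_THRESHOLD = 0.75  # fraction of blank-task answers that must match (assumed value)

class CalibrationGate:
    """Gates real crowdwork behind a few 'blank' calibration tasks."""

    def __init__(self, gold_answers):
        # gold_answers: known-correct answers for the blank tasks
        self.gold_answers = gold_answers
        self.submissions = []

    def submit(self, answer):
        """Record one calibration answer; return True once real work is unlocked."""
        if len(self.submissions) < len(self.gold_answers):
            self.submissions.append(answer)
        return self.is_unlocked()

    def score(self):
        # Fraction of calibration answers matching the known-correct ones
        correct = sum(a == g for a, g in zip(self.submissions, self.gold_answers))
        return correct / len(self.gold_answers)

    def is_unlocked(self):
        # Unlock only after all blank tasks are done and the score clears the bar
        done = len(self.submissions) == len(self.gold_answers)
        return done and self.score() >= PASS_THRESHOLD
```

For entry-level tasks instead of fake ones, the same gate applies: the answers would simply be paid, low-stakes work whose correct results are already known to the requester.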