Milestone 8 BufferOverflow Foundation2

From crowdresearch

Foundation 2: Input and output transducers

Tasks get vetted or improved by people on the platform immediately after they are submitted, and before workers are exposed to them. Results are likewise vetted and tweaked, for example through peer review.

Cost: who pays for this? In other words, can this be done without hugely increasing the cost of crowdsourcing?

Each task is submitted to a specific sub-community depending on the task category. It is pinned to the top until at least three members review it (or fewer, depending on the size of the community). So there is no need for any payment; this is how CodeProject works. In return, members who review could get early access to each job. Each member can also comment on the quality of work for other members to see.
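The pinning rule above can be sketched as follows. This is only an illustrative model, not an implementation from the proposal; the names (`Task`, `review_quorum`) and the way the quorum shrinks for small communities are assumptions.

```python
def review_quorum(community_size: int, default: int = 3) -> int:
    """Quorum is three reviewers, or fewer for very small communities (assumed rule)."""
    return min(default, max(1, community_size // 2))

class Task:
    """A submitted task, pinned to the top of its sub-community until reviewed."""

    def __init__(self, category: str):
        self.category = category
        self.reviews: list[str] = []  # reviewer comments, visible to other members
        self.pinned = True

    def add_review(self, comment: str, community_size: int) -> None:
        self.reviews.append(comment)
        # Unpin once enough members have reviewed the task.
        if len(self.reviews) >= review_quorum(community_size):
            self.pinned = False
```

For example, in a sub-community of ten members a task would stay pinned until it has collected three reviews, while in a two-member community a single review would suffice.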

Speed: is it possible to do this quickly enough to give near-immediate feedback to requesters? Like, 2–4 minutes? As spamgirl says, the #1 thing that requesters love about AMT, from her recent survey of requesters, is that the moment they post tasks, they start getting done.

Manual review takes time. There could be some automatic review, such as bots. Each bot should be programmed by senior members, and bots should "do no harm". Manual review time, though, depends on the worker/job ratio, among other things. If a requester chooses, she can opt for the automatic review instead of the manual one, but her work would then be subject to removal without notice and would carry a badge indicating automated review.
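The automatic-review path might look like the following sketch. Everything here is hypothetical (the `Job` fields, the badge name, and the `passes_checks` input are assumptions); the key properties it illustrates are from the text: a bot may approve but never rejects ("do no harm"), and a bot-approved job is badged and stays removable without notice.

```python
from dataclasses import dataclass, field

@dataclass
class Job:
    title: str
    badges: set = field(default_factory=set)
    removable_without_notice: bool = False

def bot_review(job: Job, passes_checks: bool) -> bool:
    """A 'do no harm' bot: it may approve a job, but never rejects one outright."""
    if passes_checks:
        job.badges.add("automated review")      # visible badge on the job
        job.removable_without_notice = True     # trade-off of skipping manual review
        return True
    # Failing the checks is not a rejection; the job simply waits for manual review.
    return False
```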

From Edwin: What happens when I have a task that I know is hard but I want workers to just try their best and submit? I’m OK with it being subjective, but the panel would just reject my task, which would have been frustrating.



In that case, you opt for the automatic review process. Moreover, manual approval is unidirectional(?), meaning that no one can reject your work; three members just have to approve it. Note that community members aren't necessarily workers alone. There could be requesters among them as well.

From Edwin: Could this help deal with people feeling bad when rejecting work? Maybe we need a new metaphor, like revision.

There is no rejection. Requesters will receive comments from community members, but that's it. If they get three votes, they're in. Meanwhile, requesters can also reply to those comments or incorporate them into their job. To support this scenario, each job stays pinned for only three days before it is unpinned. If anything about the job is updated, it gets pinned again, unless it has received too many downvotes (there is a downvote!). In that case the job will be flagged as spam, with a penalty for the requester.
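The pin/unpin lifecycle described above can be sketched like this. The three-day pin window and re-pinning on update come from the text; the downvote threshold, the class name, and the method names are assumptions for illustration.

```python
from datetime import datetime, timedelta

PIN_DURATION = timedelta(days=3)   # from the proposal: pinned for three days
SPAM_THRESHOLD = 10                # assumed downvote limit, not specified in the text

class PinnedJob:
    """A job's pin lifecycle: pinned on submission, re-pinned on update,
    flagged as spam once downvotes cross the threshold."""

    def __init__(self, now: datetime):
        self.pinned_until = now + PIN_DURATION
        self.downvotes = 0
        self.flagged_as_spam = False

    def update(self, now: datetime) -> None:
        """Any edit to the job re-pins it for another three days."""
        if not self.flagged_as_spam:
            self.pinned_until = now + PIN_DURATION

    def downvote(self) -> None:
        self.downvotes += 1
        if self.downvotes >= SPAM_THRESHOLD:
            self.flagged_as_spam = True  # penalty for the requester applies here

    def is_pinned(self, now: datetime) -> bool:
        return not self.flagged_as_spam and now < self.pinned_until
```

Note that there is still no rejection in this model: downvotes only affect visibility and, past the threshold, the spam flag.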