Milestone 8 Sundevils Foundation2


Foundation 2: Input and output transducers

Challenge question 1: Cost. Who pays for this? In other words, can this be done without hugely increasing the cost of crowdsourcing?


Having the platform pay for the review of each task before it is posted would be very expensive. We suggest a "Reviewer System" to overcome this (a code sketch follows the list below). The procedure is:

1.) The task is posted to a pool of experts belonging to a particular skill community; the pool is chosen based on the tags provided by the requester.

2.) A reviewer volunteers from the pool to take up the task and review it. The platform does not pay the reviewer at this point.

3.) For each finished review, the volunteer earns a credit point.

4.) When a volunteer's credit points reach a threshold, they receive a monetary reward for the points earned.
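
A minimal Python sketch of this Reviewer System, assuming illustrative names and payout values (PAYOUT_THRESHOLD, PAYOUT_AMOUNT, and the class and field names are ours, not part of any platform):

from __future__ import annotations

from dataclasses import dataclass

PAYOUT_THRESHOLD = 10   # credit points needed before a payout (assumed value)
PAYOUT_AMOUNT = 5.00    # reward paid per threshold reached (assumed value)

@dataclass
class Task:
    description: str
    tags: set[str]        # requester-provided tags, used to pick a skill pool

@dataclass
class Volunteer:
    name: str
    skills: set[str]
    credits: int = 0
    earnings: float = 0.0

class ReviewerSystem:
    def __init__(self) -> None:
        # Step 1: each skill tag has its own pool of pending tasks.
        self.pools: dict[str, list[Task]] = {}

    def post_task(self, task: Task) -> None:
        # Route the task to every skill pool matching its tags.
        for tag in task.tags:
            self.pools.setdefault(tag, []).append(task)

    def claim_and_review(self, volunteer: Volunteer, tag: str) -> Task | None:
        # Step 2: a volunteer from the matching pool takes up a task;
        # the platform pays nothing at this point.
        if tag not in volunteer.skills or not self.pools.get(tag):
            return None
        task = self.pools[tag].pop(0)
        self._credit(volunteer)
        return task

    def _credit(self, volunteer: Volunteer) -> None:
        # Step 3: each finished review earns one credit point.
        volunteer.credits += 1
        # Step 4: once the threshold is reached, credits convert to money.
        if volunteer.credits >= PAYOUT_THRESHOLD:
            volunteer.credits -= PAYOUT_THRESHOLD
            volunteer.earnings += PAYOUT_AMOUNT

For example, posting a task tagged "translation" and having a volunteer with that skill claim it adds one credit to the volunteer; the platform only spends money once a volunteer crosses the threshold.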

[Image: M8 F2 1.jpg]

Challenge question 2: Speed. Is it possible to do this quickly enough to give near-immediate feedback to requesters?

Since we are trying to cut back on cost, there is a trade-off with speed. To improve speed, we can run an automated algorithm that checks for and clears grammar and syntax errors before a reviewer sees the task. This reduces the time taken to review each task, and because the reviewing work is made lighter, volunteers take up tasks for review faster.
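
A rough illustration of such a pre-check, using only cheap standard-library heuristics (the specific checks below are stand-ins we chose, not a particular grammar-checking tool):

import re

def precheck_task_text(text: str) -> list[str]:
    # Lightweight automated checks run the moment a task is submitted,
    # so obvious problems never reach a human reviewer.
    issues: list[str] = []
    # Doubled words such as "the the".
    if re.search(r"\b(\w+)\s+\1\b", text, flags=re.IGNORECASE):
        issues.append("repeated word")
    # Unbalanced parentheses.
    if text.count("(") != text.count(")"):
        issues.append("unbalanced parentheses")
    # Sentences that do not start with a capital letter.
    for sentence in re.split(r"[.!?]\s+", text.strip()):
        if sentence and sentence[0].islower():
            issues.append("uncapitalized sentence: " + sentence[:30])
    return issues

Tasks that pass go straight to the reviewer pool; tasks with issues bounce back to the requester immediately, which is what gives the near-immediate feedback.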

Challenge question 3: What happens when I have a task that I know is hard but I want workers to just try their best and submit?

Caveat for the Reviewer

When a task is abstract or difficult, the requester attaches a caveat, i.e., a warning notice to the reviewer. On seeing a caveat, the reviewer looks closely into the task and can approve it even though it is abstract or difficult, since the requester only expects workers to try their best and submit.
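
A minimal sketch of how the caveat flag (parameter names assumed, not from the source) could steer the reviewer's verdict:

def review_decision(task_is_clear: bool, has_caveat: bool) -> str:
    # A clear, well-formed task is approved regardless of difficulty.
    if task_is_clear:
        return "approve"
    # A caveat signals that the ambiguity or difficulty is intentional:
    # workers should just try their best and submit, so the reviewer
    # inspects closely but does not reject the task for being hard.
    if has_caveat:
        return "approve after close inspection"
    # Without a caveat, an unclear task goes back for revision.
    return "return to requester"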