= Winter Milestone 6 =
== Task Feed ==
Revision as of 15:47, 15 February 2016
Due dates (PST): submission by 8:00 pm on February 14, 2016; peer evaluation by 12:00 pm on February 15, 2016
This week, we will accept proposals to pursue different aspects of the project, and start a design test run.
Our goal for this week is to converge on a single well-defined systems section. By the end of the week, we want to produce a very specific proposal detailing exactly what needs to be built/done in order to embark on our study. You all wrote a bunch of great proposals last week, and we want to extract the best ideas so that we can synthesize a single system design.
Please be sure to have read this paper on the state of the art in personalized task recommendation in crowdsourcing systems. It is really important that our design and our framing of our contributions are novel, and this paper succinctly describes what work has already been done in this domain. In particular, the Findings and Discussion sections describe previous research and what the author believes are viable future directions, and Table 2 provides links to other relevant papers.
By Tuesday night / Wednesday morning, read each of last week’s submissions on Meteor and write comments in an “I like / I wish” style. For example, “I like how your proposal strives to extend Boomerang by enforcing an incentive compatible task feed that accurately estimates hourly wages” and “I wish your design accounted for workers who did not provide accurate time estimates even though they did produce good results and the effect this would have on their feed.”
On Wednesday from 9am-11am PST, @michaelbernstein will lead a hangout to synthesize these ideas and outline a single proposal. We will then take this outline and fill in any missing details during the rest of the week and create a single well-written systems section.
Michael's summary abstract after the brainstorms:
Boomerang: Incentivizing Information Disclosure in Paid Crowdsourcing Platforms

There is a massive amount of information necessary for a healthy crowdsourcing marketplace — for example, accurate reputation ratings, skill tags on tasks, and hourly wage estimates for tasks — that is privately held by individuals but rarely shared. We introduce Boomerang, an interactive task feed for a crowdsourcing marketplace that incentivizes accurate sharing of this information by making the information directly impact the sharer's own future tasks or workers. Requesters' ratings of workers, and their skill classifications of tasks, are used to give early access to workers whom that requester rates highly and who are experts in that skill, so giving a high rating to a mediocre worker dooms the requester to more mediocre work from that worker. Workers' ratings of requesters are used to rank their highly rated requesters at the top of the task feed, and their estimates of active work time are used to estimate their hourly wage on other tasks on the platform.
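To make the incentive loop above concrete, here is a minimal sketch of the worker-side feed described in the abstract: tasks from highly rated requesters float to the top, with estimated hourly wage (computed from worker-reported active work time) breaking ties. This is not the system's actual implementation — all names (`Task`, `hourly_wage`, `rank_feed`) and the exact ranking rule are hypothetical illustrations of the mechanism.

```python
from dataclasses import dataclass

@dataclass
class Task:
    requester: str
    skill: str
    reward: float       # payment in dollars
    est_minutes: float  # worker-reported active work time

def hourly_wage(task: Task) -> float:
    # Wage estimate derived from the worker's active-time report,
    # so inaccurate time estimates directly distort the worker's own feed.
    return task.reward / (task.est_minutes / 60.0)

def rank_feed(tasks: list[Task], requester_ratings: dict[str, int]) -> list[Task]:
    # Sort by the worker's rating of each requester (high first),
    # then by estimated hourly wage (high first).
    return sorted(
        tasks,
        key=lambda t: (requester_ratings.get(t.requester, 0), hourly_wage(t)),
        reverse=True,
    )
```

The point of the sketch: because the worker's own ratings and time estimates feed directly into this ordering, misreporting them degrades the worker's own feed — the "boomerang" effect the abstract describes.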
The task feed hangouts from last week:
Michael's synthesized needs:
- to find new tasks that will maximize income (reduce uncertainty about payment and rejection; maximize certainty about what will be asked of me and how quickly I can do it)
- to find new tasks that fit my expertise profile
- to re-find old requesters' new tasks, since I know I like them
- to identify tasks I can do on my own time
- to learn new skills
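The needs above could feed into a single per-task score that a feed would sort by. The sketch below is a toy illustration, not a proposed design: the weights, field names, and the choice of which needs to encode (wage, expertise fit, requester familiarity) are all hypothetical.

```python
def task_score(task: dict, worker: dict, weights=(1.0, 1.0, 0.5)) -> float:
    """Toy feed score combining three of the synthesized needs:
    expected hourly wage, expertise match, and familiarity with the requester."""
    w_wage, w_skill, w_known = weights
    # Income: expected hourly wage from reward and estimated active minutes.
    wage = task["reward"] / (task["est_minutes"] / 60.0)
    # Expertise: does the task's skill tag match the worker's profile?
    skill_match = 1.0 if task["skill"] in worker["skills"] else 0.0
    # Re-finding: bonus for requesters the worker has worked with before.
    known = 1.0 if task["requester"] in worker["past_requesters"] else 0.0
    return w_wage * wage + w_skill * skill_match + w_known * known
```

A real design would also need to cover the remaining needs (working on one's own time, learning new skills), which do not reduce to a static score as easily.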