Milestone 4

During this milestone, we'll have you refine your ideas from the previous milestone.

  • YouTube link of today's meeting: (will be uploaded later)
  • Meeting 3 slideshow: (will be uploaded later)

Themes from Milestone 3

Descriptions: How might workers and requesters work together to produce higher-quality task descriptions?

Example ideas for this theme:
• New tasks go to workers first, who improve the task before it goes live: http://crowdresearch.stanford.edu/w/index.php?title=Milestone_3_Pumas_PowerIdea_2:_lovePower
• New tasks go into a holding pattern where they need to be "voted in" by workers (sketched below): http://crowdresearch.stanford.edu/w/index.php?title=Milestone_3_Triple_Clicks_PowerIdea_1:_Voting_on_HIT_design, also http://crowdresearch.stanford.edu/w/index.php?title=Milestone_3_Team_Innovation_2_PowerIdea1:_Worker_Feedback_on_Tasks
• Task templates (can we do better than AMT's templates?): http://crowdresearch.stanford.edu/w/index.php?title=Milestone_3_PixelPerfect_PowerIdea_2:_Task_Templates
• Stronger categorization of work: http://crowdresearch.stanford.edu/w/index.php?title=Milestone_3_TuringMachine_TrustIdea_1:_Distribution_of_work_according_to_workers%27_interests
• An artificial Turker that has to understand tasks before they go live: http://crowdresearch.stanford.edu/w/index.php?title=Milestone_3_Betzy_DarkHorseIdea:_Artificial_Turker
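
To make the "voted in" holding pattern concrete, here is a minimal sketch; the class name, vote thresholds, and status values are hypothetical assumptions for illustration, not any platform's actual API. A draft task stays in review until enough workers approve its description, and goes back for revision if it collects too many rejections.

  # Hypothetical sketch: a new task waits in review until enough workers vote it in.
  from dataclasses import dataclass

  APPROVAL_THRESHOLD = 5   # assumed number of worker approvals needed to go live
  MAX_REJECTIONS = 3       # assumed number of rejections that send it back for edits

  @dataclass
  class DraftTask:
      title: str
      description: str
      approvals: int = 0
      rejections: int = 0
      status: str = "in_review"   # in_review -> live, or in_review -> needs_revision

      def record_vote(self, approve: bool) -> None:
          """Tally one worker's vote and update the task's status."""
          if approve:
              self.approvals += 1
          else:
              self.rejections += 1
          if self.approvals >= APPROVAL_THRESHOLD:
              self.status = "live"
          elif self.rejections >= MAX_REJECTIONS:
              self.status = "needs_revision"

  task = DraftTask("Label images", "Tag each image with the objects it contains.")
  for vote in [True, True, False, True, True, True]:
      task.record_vote(vote)
  print(task.status)   # "live" after the fifth approval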

Results: How might workers and requesters work together to produce higher-quality results?

• Workers review other workers' work, like on MobileWorks (sketched below): http://crowdresearch.stanford.edu/w/index.php?title=Milestone_3_pentagram_TrustIdea_2:_Peer_Review_System
• Could we create a crowd contractor who is in charge of making sure each submission gets good results? (M) Related to crowd managers: http://crowdresearch.stanford.edu/w/index.php?title=Milestone_3_taskforce_poweridea1_qualitycontrolmanagers
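
As a rough illustration of worker peer review, here is a minimal sketch with an assumed reviewer count and a simple majority rule; it is not MobileWorks' actual mechanism. A submission is accepted only once enough peer reviewers have weighed in and a majority approve it.

  # Hypothetical sketch: accept a submission when most peer reviewers approve it.
  def peer_review_decision(reviews: list[bool], min_reviews: int = 3) -> str:
      """Return 'accepted', 'rejected', or 'pending' based on peer reviews.

      reviews     -- True/False approval flags from peer workers
      min_reviews -- assumed minimum number of reviews before deciding
      """
      if len(reviews) < min_reviews:
          return "pending"
      approvals = sum(reviews)
      return "accepted" if approvals > len(reviews) / 2 else "rejected"

  print(peer_review_decision([True, True, False]))  # accepted
  print(peer_review_decision([True]))               # pending (needs more reviews)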

Resolution: How might we enable a fairer dispute resolution process?

• Moderators can review whether rejected work should be disputed: http://crowdresearch.stanford.edu/w/index.php?title=Milestone_3_YourTeamName_PowerIdea_1:_Moderator_justification_for_rejected_hit, also http://crowdresearch.stanford.edu/w/index.php?title=Moderators_as_arbiters_of_disputes
• A dispute resolution process: http://crowdresearch.stanford.edu/w/index.php?title=Dispute_resolution
• Bad actors get hellbanned to a separate circle for a time, or their tasks are done more slowly: http://crowdresearch.stanford.edu/w/index.php?title=Milestone_3_pentagram_TrustIdea_1:_Priority_System
• Force requesters to resolve previous problems before they can post new tasks (sketched below): http://crowdresearch.stanford.edu/w/index.php?title=Milestone_3_munichkindl_DarkHorseIdea_HangoutWithKetchup
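
To picture the "resolve before you post" idea, here is a minimal sketch under assumed names; no real platform API is implied. A requester with open disputes is blocked from posting new tasks until those disputes are resolved.

  # Hypothetical sketch: block new task postings while a requester has open disputes.
  class Requester:
      def __init__(self, name: str):
          self.name = name
          self.open_disputes: list[str] = []

      def can_post_task(self) -> bool:
          """A requester may post new work only when no disputes remain open."""
          return not self.open_disputes

      def resolve_dispute(self, dispute_id: str) -> None:
          self.open_disputes.remove(dispute_id)

  r = Requester("acme")
  r.open_disputes.append("dispute-42")
  print(r.can_post_task())   # False until the dispute is resolved
  r.resolve_dispute("dispute-42")
  print(r.can_post_task())   # True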

Empathy: How might we build empathy?

• Meetings outside of the system to build trust: http://crowdresearch.stanford.edu/w/index.php?title=Milestone_3_taskforce_darkehorseidea_crowdbeers
• Make workers act as requesters, and vice versa: http://crowdresearch.stanford.edu/w/index.php?title=Milestone_3_Pumas_TrustIdea_1:_empathy
• Requesters can send gifts to workers, or a certain amount of their costs is reserved to pay bonuses to workers: http://crowdresearch.stanford.edu/w/index.php?title=Milestone_3_Opera_DarkHorse:Surprise_gifts, also http://crowdresearch.stanford.edu/w/index.php?title=Milestone_3_Team_Innovation_2_DarkHorseIdea:_Bonus_Pool_Payment_System
• Humanizing worker profiles: http://crowdresearch3.meteor.com/posts/XbuMSCMAM2XNxjM2W

Transparency: How might we make payment clear and transparent?

• Standardize task pricing: http://crowdresearch.stanford.edu/w/index.php?title=Milestone_3_PixelPerfect_DarkHorseIdea:_Standardization_of_Task_Pricing
• A required minimum wage if you stay above the 15th percentile of "good work throughput"?: http://crowdresearch.stanford.edu/w/index.php?title=Milestone_3_TestSet_DarkHorseIdea
• Checkpoints where you get reviewed and paid after every N tasks (sketched below): http://crowdresearch.stanford.edu/w/index.php?title=Milestone_3_ams_TrustIdea_2:_Checkpoint_System
• Offer increased compensation to the first few people who take the task: http://crowdresearch.stanford.edu/w/index.php?title=Milestone_3_Mustang_DarkHorse:_The_Kickstarter_Model
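
To illustrate the checkpoint idea, here is a minimal sketch; the batch size, per-task rate, and approval handling are assumptions for illustration, not the linked proposal's actual design. Payment is released for each full batch of N reviewed tasks, counting only the approved tasks in that batch.

  # Hypothetical sketch: review and pay a worker after every N completed tasks.
  CHECKPOINT_SIZE = 10          # assumed number of tasks per checkpoint
  PAY_PER_TASK = 0.25           # assumed payment per approved task, in dollars

  def checkpoint_payout(task_results: list[bool]) -> float:
      """Pay out only for full checkpoints, counting approved tasks in each batch."""
      total = 0.0
      full_batches = len(task_results) // CHECKPOINT_SIZE
      for b in range(full_batches):
          batch = task_results[b * CHECKPOINT_SIZE:(b + 1) * CHECKPOINT_SIZE]
          total += sum(batch) * PAY_PER_TASK   # approved tasks in this checkpoint
      return total

  results = [True] * 9 + [False] + [True] * 4   # 14 tasks done, one full checkpoint
  print(checkpoint_payout(results))             # 2.25: 9 approved tasks in batch 1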

Systems: How might we design better reputation systems?

• Leveling up as a worker or requester gives you better wages (M), first access to tasks for workers, etc. (sketched below): http://crowdresearch.stanford.edu/w/index.php?title=Milestone_3_PixelPerfect_TrustIdea_2:_User_Rating_System, also http://crowdresearch.stanford.edu/w/index.php?title=Different_Levels_of_workers
• Ranking top workers and making them the most available: http://crowdresearch.stanford.edu/w/index.php?title=Milestone_3_ams_PowerIdea_1:_Bring_Top_Workers_Closer_to_Requesters
• Workers rate requesters as they do tasks; the ratings are used to learn matching for other workers
• Tasks are first available to workers who match according to skill and performance
• Skill categories on the platform: http://crowdresearch.stanford.edu/w/index.php?title=Milestone_3_Professionals_PowerIdea_1:_Expose_worker_skills
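
Here is a minimal sketch of how leveling could work; the thresholds, level count, and release delays are invented for illustration and are not any platform's real policy. A worker's completed-task count and approval rate map to a level, and higher levels see newly posted tasks earlier.

  # Hypothetical sketch: derive a reputation level from task count and approval rate.
  def worker_level(tasks_completed: int, approval_rate: float) -> int:
      """Return a level from 1 to 3; higher levels see new tasks first.

      The thresholds below are illustrative assumptions, not a real policy.
      """
      if tasks_completed >= 1000 and approval_rate >= 0.97:
          return 3
      if tasks_completed >= 100 and approval_rate >= 0.90:
          return 2
      return 1

  def release_delay_hours(level: int) -> int:
      """Higher-level workers get earlier access to newly posted tasks."""
      return {3: 0, 2: 12, 1: 24}[level]

  print(worker_level(1500, 0.98), release_delay_hours(3))  # 3 0
  print(worker_level(50, 0.95), release_delay_hours(1))    # 1 24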

Other: Interesting dark horse ideas outside of these themes

• Supply/demand curves for pricing, based on task classes (sketched below)
• Democracy to decide on policy for the platform: http://crowdresearch.stanford.edu/w/index.php?title=Milestone_3_RATH_Power_Idea_2:_Systemic_Democracy
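
As a toy example of supply/demand pricing per task class, here is a minimal sketch; the multiplier formula and its bounds are assumptions for illustration only. The suggested price scales with the ratio of open tasks to active workers in that class.

  # Hypothetical sketch: adjust a base price by the demand/supply ratio per task class.
  def suggested_price(base_price: float, open_tasks: int, active_workers: int) -> float:
      """Scale the base price by demand (open tasks) over supply (active workers).

      The multiplier is clamped to [0.5, 2.0]; both the formula and the bounds
      are illustrative assumptions.
      """
      ratio = open_tasks / max(active_workers, 1)
      multiplier = min(max(ratio, 0.5), 2.0)
      return round(base_price * multiplier, 2)

  print(suggested_price(0.10, open_tasks=300, active_workers=100))  # 0.2 (high demand)
  print(suggested_price(0.10, open_tasks=40, active_workers=100))   # 0.05 (low demand)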

Recommended Readings

Mason, W. and Watts, D. J. Financial incentives and the performance of crowds. ACM SIGKDD Explorations Newsletter 11(2), 2010: 100-108.

Dow, S., Kulkarni, A., Klemmer, S., et al. Shepherding the crowd yields better work. In Proceedings of the ACM 2012 Conference on Computer Supported Cooperative Work (CSCW). ACM, 2012: 1013-1022.

Deliverables for Milestone 4

Read Ideas from Milestone 3

The first step to refining ideas is to become familiar with all the ideas that were generated by all the teams. Your first task is to read through every idea and classify it into the themes mentioned above. You do not have to submit anything for this part. Create a list of as many ideas as you can and classify them into the above themes.

Detailed Exploration of Ideas

From the above themes, choose the 3 you are most interested in and, for each, develop an idea that combines a subset of the Milestone 3 ideas related to that theme. You may use an existing idea from Milestone 3 and expand upon it by exploring its feasibility.

In Milestone 3, you had the opportunity to flare and consider many different ideas. Now we have focused on specific themes that have arisen from your ideas. The main purpose of this milestone is to flare again by generating ideas related to these themes. You are required to come up with 3 detailed ideas, each from a different theme. You may use at most one existing idea from Milestone 3 and expand on it; the other two ideas must be original and must not be ones that have already been presented in Milestone 3.

1. Create a succinct description of the problem that the theme is trying to address.

2. Summarize a few ideas from Milestone 3 related to this theme that try to address this problem.

3. Delve deep and explain your idea and how it can be used to solve the problem you described in your problem statement. Consider how your idea can be used. Mention its limitations. Use charts, sketches, diagrams, tables, and anything else you think will help articulate your idea.

Submitting

Create a Wiki Page for your Team's Submission

Please create a page for your team's submission at http://crowdresearch.stanford.edu/w/index.php?title=Milestone_4_YourTeamName&action=edit (substituting your team name for YourTeamName) and copy over the template at Milestone 4 Template. If you have never created a wiki page before, please see this or watch this.

[Team Leaders] Submission: Post the links to your ideas by 25th March, 11:59 pm

We have a service on which you can post the links to the wiki pages for the individual ideas you generated, explore them, and upvote them.

Instructions for posting are at http://crowdresearch.meteor.com/posts/bXSNbqihjajASBQEL

Sign-up instructions: Log in with either Twitter or Facebook. When it asks you to pick your username, pick the same username as your Slack username; this will help us identify and track your contributions better.

There are 7 possible submission categories:

1- http://crowdresearch.meteor.com/category/milestone-4-descriptions where you can post links to the wiki pages for each of the 2 description-related ideas you generated.

2- http://crowdresearch.meteor.com/category/milestone-4-results where you can post links to the wiki pages for each of the 2 results-related ideas you generated.

3- http://crowdresearch.meteor.com/category/milestone-4-empathy where you can post links to the wiki pages for each of the 2 empathy-related ideas you generated.

4- http://crowdresearch.meteor.com/category/milestone-4-resolution where you can post links to the wiki pages for each of the 2 resolution-related ideas you generated.

5- http://crowdresearch.meteor.com/category/milestone-4-transparency where you can post links to the wiki pages for each of the 2 transparency-related ideas you generated.

6- http://crowdresearch.meteor.com/category/milestone-4-systems where you can post links to the wiki pages for each of the 2 systems-related ideas you generated.

7- http://crowdresearch.meteor.com/category/milestone-4-other where you can post links to the wiki pages for each of the 2 other ideas you generated.

Post links to your ideas only once they're finished. Give your posts titles which summarize your idea. Viewers should be able to get the main point by skimming the title ("Automatic Pricing for Tasks based on Average Completion Time" is a good title. "YourTeam Idea 1" is a bad title).

Please submit your finished ideas by 11:59 pm, 25th March 2015, and DO NOT vote or comment until 12:05 am, 26th March.

[Everyone] Peer evaluation (upvote the ones you like and comment on them) from 12:05 am 26th March until 9 am 27th March

After the submission phase, you are welcome to browse through, upvote, and comment on others' ideas. We especially encourage you to look at and comment on ideas that haven't yet gotten feedback, to make sure everybody's ideas get feedback. You can use http://crowdresearch.meteor.com/needcomments to find ideas that haven't yet gotten feedback, and http://crowdresearch.meteor.com/needclicks to find ideas that haven't yet been viewed many times.

COMMENT BEST PRACTICES: As on Crowdgrader, everybody reviews at least 3 ideas, each supported by a comment. The comment has to justify your reason for the upvote. The comment should be constructive and should mention a positive aspect of the idea worth sharing. Negative comments are discouraged; instead, phrase your comment as a suggestion: if you disliked an idea, try to suggest improvements (do not criticize an idea; no idea is bad, and every idea has room for improvement).

[Team Leaders] Milestone 4 Submissions

To help us track and browse all submissions, once you have finished your Milestone 4 page, go to the link below and post the link:

Milestone 4 Submissions