Milestone 2

Due date: 11:59 pm 11th March 2015 for submission, 9 am 13th March 2015 for peer-evaluation. Please note that Daylight Saving Time in the United States begins at 2:00 AM on Sunday, March 8, 2015. Participants outside the US may want to double-check how the time change affects their deadlines and the panel times below.

The goals for this week are to:

  • Learn about needfinding
  • Determine the needs of workers and requesters (from panels and from readings)

Learn about Needfinding

We covered the basics of needfinding in this week's meeting. You may also be interested in viewing Scott Klemmer's HCI lectures on needfinding.

Attend a Panel to Hear from Workers and Requesters

There will be two panels on Hangouts on Air where experienced workers and requesters will discuss their experiences on crowdsourcing platforms (MTurk, oDesk, etc.). You should attend one to better understand the needs of workers and requesters:

  • Panel 1: 8:30 am PST (Pacific Time) / 10 pm IST (Indian Time) on Monday March 9th
  • Panel 2: 6 pm PST (Pacific Time) / 9 pm EST (Eastern Time) on Monday March 9th

Deliverable

What were the most salient insights you got from hearing the workers and requesters on the panel? What are their needs?

Readings

Please read the following materials. The papers are copyrighted, so please don't redistribute them. You need to be signed in to view them.

Turkopticon

Irani L C, Silberman M. Turkopticon: interrupting worker invisibility in Amazon Mechanical Turk. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM, 2013: 611-620.

If you are able to access Mechanical Turk, you can go try Turkopticon yourself here.

Being a Turker

Martin D, Hanrahan B V, O'Neill J, et al. Being a turker. Proceedings of the 17th ACM conference on Computer supported cooperative work & social computing. ACM, 2014: 224-235.

Crowdsourcing User Studies with Mechanical Turk

Kittur A, Chi E H, Suh B. Crowdsourcing user studies with Mechanical Turk. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM, 2008: 453-456. (A toy sketch of this paper's verifiable-question idea appears after this reading list.)

A Plea to Amazon: Fix Mechanical Turk

The Need for Standardization in Crowdsourcing
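
One practical takeaway often drawn from the Kittur, Chi, and Suh reading is to include explicitly verifiable ("gold") questions in a task so that low-effort responses can be detected. Below is a toy sketch of that idea in Python; the field names, response data, and expected answer are all hypothetical placeholders, not taken from the paper.

  # Toy sketch: filter crowd responses using a verifiable "gold" question,
  # in the spirit of Kittur, Chi & Suh (2008). All field names and the
  # expected answer are hypothetical.
  responses = [
      {"worker": "A1", "rating": 4, "num_images_on_page": "3"},
      {"worker": "B2", "rating": 5, "num_images_on_page": "7"},  # fails the gold check
      {"worker": "C3", "rating": 2, "num_images_on_page": "3"},
  ]

  GOLD_ANSWER = "3"  # an objectively checkable fact about the page being rated

  # Keep only responses whose answer to the verifiable question is correct.
  kept = [r for r in responses if r["num_images_on_page"] == GOLD_ANSWER]
  print("kept %d of %d responses" % (len(kept), len(responses)))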

Deliverables

For each of the above readings, please answer the following:

What worker needs are discussed or implied by the reading? What requester needs are discussed or implied by the reading?

Recommended (but optional) Materials

These optional materials will help you better understand worker and requester needs:

The People Inside Your Machine - a 22-minute NPR radio program about crowd workers

Helpful Blog Posts To Help You Design Your HITs - links to several resources aimed towards helping requesters create better HITs

Marshall C C, Shipman F M. Experiences surveying the crowd: Reflections on methods, participation, and reliability. Proceedings of the 5th Annual ACM Web Science Conference. ACM, 2013: 234-243. - discusses using MTurk for surveys (requesters' needs)

Stolee K T, Elbaum S. Exploring the use of crowdsourcing to support empirical studies in software engineering. Proceedings of the 2010 ACM-IEEE International Symposium on Empirical Software Engineering and Measurement. ACM, 2010: 35. - discusses using MTurk for software engineering studies, and challenges recruiting workers with specialized skills (requesters' needs)

Little G, Chilton L B, Goldman M, et al. TurKit: human computation algorithms on Mechanical Turk. Proceedings of the 23rd Annual ACM Symposium on User Interface Software and Technology. ACM, 2010: 57-66. - discusses using MTurk programmatically for iterative tasks (requesters' needs); a minimal sketch of what programmatic access looks like appears after this list.

Salehi N, Irani L C, Bernstein M S. We Are Dynamo: Overcoming Stalling and Friction in Collective Action for Crowd Workers. 2015. - a system for organizing workers towards collective action (workers' needs)

Guidelines for Academic Requesters - A set of guidelines written by Dynamo participants which discusses how to avoid common mistakes made by requesters
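
For those curious what "programmatic" MTurk access (as in TurKit) looks like, here is a minimal sketch of posting a single HIT from Python. It is only an illustration, assuming the boto3 library is installed and AWS credentials are configured; the title, reward, and task URL are hypothetical placeholders, and it targets the requester sandbox so no real money is spent. (TurKit itself wraps calls like this in a JavaScript "crash-and-rerun" model for iterative tasks.)

  # Minimal sketch: post one HIT via the MTurk API (requester sandbox).
  # Assumes boto3 is installed and AWS credentials are configured.
  import boto3

  client = boto3.client(
      "mturk",
      region_name="us-east-1",
      endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
  )

  # An ExternalQuestion pointing at a (hypothetical) HTTPS page hosting the task.
  question_xml = """<ExternalQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2006-07-14/ExternalQuestion.xsd">
    <ExternalURL>https://example.com/my-task</ExternalURL>
    <FrameHeight>600</FrameHeight>
  </ExternalQuestion>"""

  hit = client.create_hit(
      Title="Improve a paragraph (iteration 1)",   # hypothetical task
      Description="Read a paragraph and improve its wording.",
      Keywords="writing, editing",
      Reward="0.05",                               # USD, passed as a string
      MaxAssignments=3,                            # three workers this iteration
      LifetimeInSeconds=24 * 60 * 60,              # HIT visible for one day
      AssignmentDurationInSeconds=10 * 60,         # ten minutes per assignment
      Question=question_xml,
  )
  print("Created HIT:", hit["HIT"]["HITId"])

An iterative, TurKit-style workflow would loop: post a HIT with the current text, wait for assignments, fold the improved text back in, and post the next iteration.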

Do Needfinding by Browsing MTurk-related Forums, Blogs, Reddit, etc.

There are a number of forums dedicated to Mechanical Turk workers and requesters, such as Turker Nation, MTurk Forum, MTurk Grind, etc.

Requesters also often ask for help and advice on Reddit - you can search for requester help on Reddit.

You should browse these resources (you are welcome to find other crowd-work related resources as well), aiming to discover needs that workers and requesters have.

Deliverable

What are some worker and requester needs you discovered while browsing these forums? Summarize the needs, and link to examples (posts / threads).

Synthesize the Needs You Found

You should now have found many needs of workers and requesters. Please synthesize the most interesting ones you found as a set of bullet points (one set of bullet points for worker needs, another set of bullet points for requester needs).

Deliverables

A set of bullet points summarizing the needs of workers.

  • Example: workers need to be respected

A set of bullet points summarizing the needs of requesters.

  • Example: requesters need to trust the results they get from workers

Submitting

Create a Wiki Page for your Team's Submission

Please create a page for your team's submission at http://crowdresearch.stanford.edu/w/index.php?title=Milestone_2_YourTeamName&action=edit (substituting your team name for YourTeamName), and copy over the template at Milestone 2 Template. If you have never created a wiki page before, please see this or watch this.

Submit on CrowdGrader and do Peer Evaluations

After you have put your team's submission on the wiki, post the link to the wiki page you created on CrowdGrader!

Step 1 for everyone: Most of you do not have to enroll; I have already done it for you. You can go directly to http://www.crowdgrader.org/crowdgrader/venues/view_venue/879 . However, if you cannot access it, please self-enroll using this link: http://www.crowdgrader.org/crowdgrader/venues/join/879/dufipo_fivuvy_tunyge_qedumy

Step 2 for team leaders: Make sure all of your team members are enrolled in the system (I have done this for you, but please double-check). Then add them to your group/team: there is an option to add collaborators (your team members may get an email and have to confirm before they show up as your collaborators). Yes, you have to repeat this process for now; we are working with CrowdGrader so that you won't have to do it every week. In the meantime, please give yourself enough time to add the collaborators to your team.

Step 3 for team leaders: Make the submission on behalf of your team. Only team leaders should make the submission (unless it is not possible for them to do so).

Step 4 for everyone: Begin peer-evaluation. We will NOT send any email notification for this, so please check back on CrowdGrader to find submissions to evaluate. Everyone will be randomly assigned 5 submissions, of which you must grade at least 3 (you may skip 2), and 25% of your grade depends on peer-grading others. Check CrowdGrader to find and grade your assigned submissions.

Please comment to justify the score you gave, and point out the good and bad points of the submission. For this week, look for the most interesting and insightful needs; see if you can find or infer them yourself, and synthesize them into feedback that can be shared with all of us. Team leaders, please make sure that every member of your team grades their assigned submissions.

Milestone 2 Submissions

To help us track and browse all submissions, once you have finished your Milestone 2, go to the page linked below and post the link to your team's submission:

Milestone 2 Submissions

Fill out this week's survey

Please help us improve by telling us how you liked this week's milestone: fill out this survey.