Milestone 2

Due date: 11:59 pm on 11 March 2015 for submission; 9 am on 13 March 2015 for peer evaluation.

The goals for this week are to:

  • Learn about needfinding
  • Determine the needs of workers and requesters (from panels and from readings)

Learn about Needfinding

We covered the basics of needfinding in this week's meeting. You may also be interested in viewing Scott Klemmer's HCI lectures on needfinding.

Attend a Panel to Hear from Workers and Requesters

There will be two panels on Hangouts on Air where experienced workers and requesters will discuss their experiences on crowdsourcing platforms (MTurk, oDesk, etc.). You should attend one to better understand the needs of workers and requesters (a quick time-zone conversion sketch follows the list of slots):

  • Slot 1: 8:30 am PST (Pacific Time) / 10 pm IST (Indian Time) on Monday March 9th
  • Slot 2: 6 pm PST (Pacific Time) / 9 pm EST (Eastern Time) on Monday March 9th
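
If you want to double-check the slot times in your own time zone, here is a minimal sketch using the fixed offsets named above (PST = UTC-8, IST = UTC+5:30, EST = UTC-5). Note that it deliberately ignores daylight-saving transitions and just applies the offsets as listed:

  from datetime import datetime, timedelta, timezone

  # Fixed offsets as listed in the slots above (no daylight-saving handling).
  PST = timezone(timedelta(hours=-8), "PST")
  IST = timezone(timedelta(hours=5, minutes=30), "IST")
  EST = timezone(timedelta(hours=-5), "EST")

  slot1 = datetime(2015, 3, 9, 8, 30, tzinfo=PST)   # 8:30 am PST
  slot2 = datetime(2015, 3, 9, 18, 0, tzinfo=PST)   # 6 pm PST

  print(slot1.astimezone(IST))  # 2015-03-09 22:00:00+05:30 -> 10 pm IST
  print(slot2.astimezone(EST))  # 2015-03-09 21:00:00-05:00 -> 9 pm EST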

Deliverable

What were the most salient insights you got from attending the panel?

Do Needfinding Yourself (Optional)

The panels are the easiest route to needfinding, but if you want to, go ahead and talk to microtask workers and requesters yourself. If you do, we recommend paying them for their time. If you're in India, for example, you may know relatives who used to be on the platform before Amazon shut it down for Indian workers.

If you absolutely can't make it to a panel, watch the recorded videos instead. Either way, please participate in the forums.


Readings

Please read the following materials. The papers are copyrighted, so please don't redistribute them. You need to be signed in to view them.

Turkopticon

Irani L C, Silberman M S. Turkopticon: Interrupting worker invisibility in Amazon Mechanical Turk. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM, 2013: 611-620.

If you are able to access Mechanical Turk, you can go try Turkopticon yourself by installing the Chrome Extension.

Being a Turker

Martin D, Hanrahan B V, O'Neill J, et al. Being a turker. Proceedings of the 17th ACM conference on Computer supported cooperative work & social computing. ACM, 2014: 224-235.

Crowdsourcing User Studies with Mechanical Turk

Kittur A, Chi E H, Suh B. Crowdsourcing user studies with Mechanical Turk. Proceedings of the SIGCHI conference on human factors in computing systems. ACM, 2008: 453-456.

A Plea to Amazon: Fix Mechanical Turk

A Plea to Amazon: Fix Mechanical Turk - a blog post by Panos Ipeirotis.

The Need for Standardization in Crowdsourcing

The Need for Standardization in Crowdsourcing - a blog post by Panos Ipeirotis.

Deliverables

For each of the above readings, please answer the following:

  • What worker needs are discussed or implied by the reading?
  • What requester needs are discussed or implied by the reading?

Recommended (but optional) Materials

These optional materials will help you better understand worker and requester needs:


The People Inside Your Machine - a 22-minute NPR radio program about crowd workers

Helpful Blog Posts To Help You Design Your HITs - links to several resources geared towards helping requesters create better HITs

Search Requester Help on Reddit - Requesters often ask for help and advice on Reddit.

Salehi N, Irani L C, Bernstein M S, et al. We Are Dynamo: Overcoming Stalling and Friction in Collective Action for Crowd Workers. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM, 2015. - a system for organizing workers towards collective action

Guidelines for Academic Requesters - A set of guidelines written by Dynamo participants which discusses how to avoid common mistakes made by requesters

Synthesize the Needs You Found

You should now have found many needs of workers and requesters. Please synthesize the most interesting ones you found as a set of bullet points (one set of bullet points for worker needs, another set of bullet points for requester needs).

Deliverables

  • A set of bullet points of the needs of workers. Example: workers need to be respected
  • A set of bullet points of the needs of requesters. Example: requesters need to trust the results they get from workers

Submitting

Create a Wiki Page for your Team's Submission

Please create a page for your team's submission at http://crowdresearch.stanford.edu/w/index.php?title=Milestone_2_YourTeamName&action=edit (substituting your team name for YourTeamName) and copy over the template at Milestone 2 Template. If you have never created a wiki page before, please see http://www.mediawiki.org/wiki/Help:Starting_a_new_page or watch https://www.youtube.com/watch?v=83-lCpAnaFw.
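
If it helps, here is a minimal sketch of how that edit URL is built from a team name. The underscore-for-spaces convention is an assumption about this wiki's MediaWiki configuration; when in doubt, just edit the URL by hand.

  from urllib.parse import quote

  BASE = "http://crowdresearch.stanford.edu/w/index.php"

  def submission_edit_url(team_name):
      # MediaWiki page titles conventionally use underscores in place of
      # spaces, so "Team Awesome" becomes "Milestone_2_Team_Awesome".
      title = "Milestone_2_" + team_name.strip().replace(" ", "_")
      return BASE + "?title=" + quote(title) + "&action=edit"

  print(submission_edit_url("YourTeamName"))
  # http://crowdresearch.stanford.edu/w/index.php?title=Milestone_2_YourTeamName&action=edit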

Submit on CrowdGrader

After you have put your team's submission on the wiki, post the link to the wiki page you created on CrowdGrader!

Step 2 for team leaders: Make sure all of your team members have enrolled in the system (though we have already done this for you, please double-check). Now add them to your group/team - there's an option to add collaborators (your team members may get an email and have to confirm before they show up as your collaborators). Also keep an eye on deadlines and on whether your team is cooperating with you.

Step 3 for team leaders: Make the submission on behalf of your team; only team leaders should submit.

Step 4 for team leaders: Add an entry for your submission page at Milestone 2 Submissions.

Step 5 for everyone: Begin peer evaluation. Everyone will be randomly assigned 3 submissions to grade, and 25% of your grade depends on peer-grading others; check CrowdGrader to find and grade the submissions. Please comment to justify the score you gave, pointing out the good and bad points of the submission. Team leaders, please make sure that every member of your team grades their assigned submissions.
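
For intuition, here is a minimal sketch of how a random assignment like this could work. It is illustrative only - not CrowdGrader's actual algorithm - and the team and grader names are made up.

  import random

  def assign_peer_reviews(graders, submissions, per_grader=3, seed=0):
      # graders: dict mapping grader name -> their own team's submission
      # submissions: list of all team submissions
      # Each grader gets per_grader distinct submissions, never their own team's.
      rng = random.Random(seed)
      assignment = {}
      for grader, own_team in graders.items():
          eligible = [s for s in submissions if s != own_team]
          assignment[grader] = rng.sample(eligible, per_grader)
      return assignment

  teams = ["TeamA", "TeamB", "TeamC", "TeamD", "TeamE"]
  graders = {"alice": "TeamA", "bob": "TeamB", "carol": "TeamC"}
  print(assign_peer_reviews(graders, teams))

A real grading system would also balance how many reviews each submission receives; this sketch only guarantees that each grader gets three submissions that are not their own team's.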

Milestone 2 Submissions

To help us track all submissions and browse through them: once you create your Milestone 2 page, go to the page linked below and post it.

Milestone 2 Submissions

Fill out this week's survey

Please help us improve by filling out this survey and letting us know how you liked this week's milestone.