
Running Annotation Studies on Prolific

Complete guide to integrating Potato with Prolific for crowdsourced annotation, including payment and quality control.

Potato Team


Prolific provides access to a diverse, pre-vetted participant pool for research studies. This guide covers everything from setup to payment for running annotation tasks on Prolific.

Why Prolific?

  • Research-focused: Designed for academic studies
  • Quality participants: Pre-screened, attentive workers
  • Fair pay: Minimum wage requirements
  • Diverse demographics: Filter by many criteria
  • IRB-friendly: Consent management built-in

Prerequisites

  1. Prolific researcher account
  2. Potato installed and accessible via a public URL
  3. Basic configuration ready

Basic Integration

Potato Configuration

```yaml
annotation_task_name: "Research Annotation Study"

# Prolific integration via URL-based login
login:
  type: url_direct
  url_argument: PROLIFIC_PID

# Completion redirect
finish:
  redirect_url: "https://app.prolific.co/submissions/complete?cc=XXXXXX"
```

Prolific Study Setup

  1. Create new study on Prolific
  2. Study URL: https://your-server.com/annotate?PROLIFIC_PID={{%PROLIFIC_PID%}}&STUDY_ID={{%STUDY_ID%}}&SESSION_ID={{%SESSION_ID%}}
  3. Completion URL: participants are redirected back to Prolific by Potato's finish.redirect_url when they finish
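The `url_direct` login keys each session to the `PROLIFIC_PID` query parameter in that URL. Purely as an illustration (Potato parses these parameters for you), extracting them server-side might look like:

```python
from urllib.parse import urlparse, parse_qs

def prolific_params(url):
    """Pull the three Prolific identifiers out of a study URL."""
    query = parse_qs(urlparse(url).query)
    # parse_qs returns lists of values; take the first, or None if absent.
    return {k: query.get(k, [None])[0]
            for k in ("PROLIFIC_PID", "STUDY_ID", "SESSION_ID")}

url = ("https://your-server.com/annotate"
       "?PROLIFIC_PID=5f8a1b2c&STUDY_ID=642e9&SESSION_ID=7d3f1")
print(prolific_params(url))
# → {'PROLIFIC_PID': '5f8a1b2c', 'STUDY_ID': '642e9', 'SESSION_ID': '7d3f1'}
```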

Complete Integration Configuration

```yaml
annotation_task_name: "Sentiment Analysis - Prolific Study"

# Prolific integration via URL-based login
login:
  type: url_direct
  url_argument: PROLIFIC_PID

# Completion handling
finish:
  redirect_url: "https://app.prolific.co/submissions/complete?cc=C1A2B3C4"
  thank_you_message: "Thank you for participating! Your completion code is shown below."

# Data and task settings
data_files: ["data/texts.json"]

annotation_schemes:
  - annotation_type: radio
    name: sentiment
    labels: [Positive, Negative, Neutral]
    required: true

# Multi-phase workflow using surveyflow
surveyflow:
  on: true
  order:
    - consent
    - prestudy
    - annotation
    - poststudy

  consent:
    data_file: "data/consent.json"

  prestudy:
    data_file: "data/instructions.json"

instances_per_annotator: 50
```

Setting Up the Study on Prolific

Step 1: Create Study

  1. Go to Prolific Dashboard → New Study
  2. Fill in study details:
    • Title: Clear, descriptive name
    • Description: What participants will do
    • Estimated time: Be accurate (affects pay rate display)

Step 2: Configure Study URL

```text
https://your-server.com/annotate?PROLIFIC_PID={{%PROLIFIC_PID%}}&STUDY_ID={{%STUDY_ID%}}&SESSION_ID={{%SESSION_ID%}}
```

Step 3: Set Completion Code

  1. In Potato config, set a unique completion code
  2. In Prolific, use the redirect completion URL:
    ```text
    https://app.prolific.co/submissions/complete?cc=YOUR_CODE
    ```

Step 4: Participant Requirements

Filter participants by:

  • Demographics: Age, gender, nationality
  • Language: First language, fluency
  • Approval rate: Minimum past approval %
  • Custom: Previous study participation

Step 5: Payment

  • Set fair compensation (Prolific requires minimum wage)
  • Calculate: (estimated_time_minutes / 60) × hourly_rate
  • Consider adding bonus for quality
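As a quick sketch of the calculation above (the rate and study size here are illustrative, not Prolific's current minimums, and platform service fees are not included):

```python
def study_payment(estimated_minutes, hourly_rate, n_participants,
                  bonus_per_participant=0.0):
    """Base pay per participant plus total study cost (service fees excluded)."""
    base = round(estimated_minutes / 60 * hourly_rate, 2)
    total = round(n_participants * (base + bonus_per_participant), 2)
    return base, total

base, total = study_payment(estimated_minutes=25, hourly_rate=12.0,
                            n_participants=40)
print(base, total)  # → 5.0 200.0
```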

Handling Participant Flow

Use Potato's surveyflow to define the flow of your study:

yaml
surveyflow:
  on: true
  order:
    - consent
    - prestudy
    - annotation
    - poststudy
 
  consent:
    data_file: "data/consent.json"
 
  prestudy:
    data_file: "data/instructions.json"
 
  poststudy:
    data_file: "data/feedback.json"

Quality Control for Crowdsourcing

Attention Checks

Configure attention checks by including gold-standard items — items carrying a gold field with the known correct answer — in your data file:

```json
{
  "id": "attention_1",
  "text": "Please select 'Positive' for this item.",
  "gold": {"sentiment": "Positive"}
}
```
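A minimal sketch of scoring a participant's answers against such gold items — the `gold` field follows the example above, while the annotation-record shape is an assumption to adapt to your output:

```python
def score_attention_checks(gold_items, annotations):
    """Count passed/failed gold items for one participant.

    gold_items:  {item_id: {scheme_name: correct_label}}
    annotations: {item_id: {scheme_name: chosen_label}}  (assumed shape)
    """
    passed = failed = 0
    for item_id, expected in gold_items.items():
        answer = annotations.get(item_id, {})
        # Pass only if every gold scheme was answered correctly.
        if all(answer.get(s) == label for s, label in expected.items()):
            passed += 1
        else:
            failed += 1
    return passed, failed

gold = {"attention_1": {"sentiment": "Positive"}}
anns = {"attention_1": {"sentiment": "Positive"},
        "item_2": {"sentiment": "Negative"}}
print(score_attention_checks(gold, anns))  # → (1, 0)
```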

Monitoring Quality

Monitor annotator quality through Potato's admin dashboard, which shows completion rates and annotation statistics. You can also review the output annotation files to check for patterns like single-label bias or unusually fast completion times.
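Both heuristics mentioned above — single-label bias and implausibly fast annotation — can be flagged with a few lines of post-processing (the record shape here is an assumption; adapt it to your actual output files):

```python
from collections import Counter

def flag_annotator(records, min_label_diversity=0.1, min_seconds_per_item=3.0):
    """Flag a participant whose annotations look low-effort.

    records: list of {"label": str, "seconds": float}, one per item (assumed shape).
    """
    labels = Counter(r["label"] for r in records)
    top_share = labels.most_common(1)[0][1] / len(records)
    avg_secs = sum(r["seconds"] for r in records) / len(records)
    flags = []
    if 1 - top_share < min_label_diversity:   # nearly always the same label
        flags.append("single_label_bias")
    if avg_secs < min_seconds_per_item:       # implausibly fast responses
        flags.append("too_fast")
    return flags

records = [{"label": "Positive", "seconds": 1.2} for _ in range(50)]
print(flag_annotator(records))  # → ['single_label_bias', 'too_fast']
```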

Handling Rejections

Quality-control failures can be handled through the quality_control section of the Potato configuration. Participants who fail attention checks can be redirected appropriately.

Payment and Bonuses

Bonuses are managed through Prolific's interface. After your study completes, export your annotation data and calculate bonuses based on your criteria, then upload them to Prolific.
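Prolific's bulk-bonus tool accepts pasted `participant_id,amount` lines (verify the current format in Prolific's help pages). A hypothetical sketch that turns per-participant quality scores into that format — the threshold and amount are assumptions:

```python
def bonus_csv(quality_scores, threshold=0.9, bonus=1.50):
    """One 'participant_id,amount' line per participant above the threshold."""
    lines = [f"{pid},{bonus:.2f}"
             for pid, score in sorted(quality_scores.items())
             if score >= threshold]
    return "\n".join(lines)

scores = {"5f8a1b2c": 0.95, "60c3d4e5": 0.72, "61a9f0b1": 0.91}
print(bonus_csv(scores))
# → 5f8a1b2c,1.50
#   61a9f0b1,1.50
```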

Monitoring Your Study

Monitor your study through:

  • Potato's built-in admin dashboard
  • Prolific's study monitoring interface
  • Your annotation output files

Output Format

```json
{
  "participant_id": "PROLIFIC_PID_XXXXX",
  "study_id": "STUDY_ID_XXXXX",
  "session_id": "SESSION_ID_XXXXX",
  "annotations": [...],
  "metadata": {
    "start_time": "2026-01-20T10:00:00Z",
    "end_time": "2026-01-20T10:25:00Z",
    "duration_minutes": 25,
    "items_completed": 50,
    "attention_checks_passed": 5,
    "attention_checks_failed": 0
  },
  "quality_metrics": {
    "avg_time_per_item": 28.5,
    "consistency_score": 0.92
  }
}
```
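A quick sanity check over such output files — for example, confirming the recorded duration matches the timestamps (field names follow the example above):

```python
import json
from datetime import datetime

def check_duration(record, tolerance_minutes=1.0):
    """Recompute duration from the ISO timestamps and compare to the stored value."""
    meta = record["metadata"]
    # Normalize the trailing 'Z' so fromisoformat works on older Pythons too.
    start = datetime.fromisoformat(meta["start_time"].replace("Z", "+00:00"))
    end = datetime.fromisoformat(meta["end_time"].replace("Z", "+00:00"))
    actual = (end - start).total_seconds() / 60
    return abs(actual - meta["duration_minutes"]) < tolerance_minutes

record = json.loads("""{
  "metadata": {"start_time": "2026-01-20T10:00:00Z",
               "end_time": "2026-01-20T10:25:00Z",
               "duration_minutes": 25}
}""")
print(check_duration(record))  # → True
```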

Tips for Prolific Studies

  1. Pilot first: Run with 5-10 participants to test
  2. Fair pay: Prolific participants expect research-level rates
  3. Clear instructions: Reduce confusion and rejections
  4. Mobile-friendly: Some participants use phones
  5. Quick approval: Approve submissions promptly

Next Steps


See the full crowdsourcing documentation at /docs/deployment/crowdsourcing.