Running Annotation Studies on Prolific
Complete guide to integrating Potato with Prolific for crowdsourced annotation, including payment and quality control.
Prolific provides access to a diverse, pre-vetted participant pool for research studies. This guide covers everything from setup to payment for running annotation tasks on Prolific.
Why Prolific?
- Research-focused: Designed for academic studies
- Quality participants: Pre-screened, attentive workers
- Fair pay: Minimum wage requirements
- Diverse demographics: Filter by many criteria
- IRB-friendly: Consent management built-in
Prerequisites
- Prolific researcher account
- Potato installed and accessible via public URL
- Basic configuration ready
Basic Integration
Potato Configuration
```yaml
annotation_task_name: "Research Annotation Study"

# Prolific integration via URL-based login
login:
  type: url_direct
  url_argument: PROLIFIC_PID

# Completion redirect
finish:
  redirect_url: "https://app.prolific.co/submissions/complete?cc=XXXXXX"
```

Prolific Study Setup
- Create a new study on Prolific
- Study URL: `https://your-server.com/annotate?PROLIFIC_PID={{%PROLIFIC_PID%}}&STUDY_ID={{%STUDY_ID%}}&SESSION_ID={{%SESSION_ID%}}`
- Completion URL: Potato redirects participants here when they finish the task (configured in the finish section; see Step 3 below)
Complete Integration Configuration
```yaml
annotation_task_name: "Sentiment Analysis - Prolific Study"

# Prolific integration via URL-based login
login:
  type: url_direct
  url_argument: PROLIFIC_PID

# Completion handling
finish:
  redirect_url: "https://app.prolific.co/submissions/complete?cc=C1A2B3C4"
  thank_you_message: "Thank you for participating! Your completion code is shown below."

# Data and task settings
data_files: ["data/texts.json"]

# Quality control
quality_control:
  attention_checks:
    enabled: true
    frequency: 10        # Every 10th item
    fail_threshold: 2    # Fail after 2 wrong
    action_on_fail: end_session
  minimum_time_per_item: 5  # At least 5 seconds
  flag_fast_responses: true

annotation_schemes:
  - annotation_type: radio
    name: sentiment
    labels: [Positive, Negative, Neutral]
    required: true

# Study phases for consent, instructions, and training
phases:
  - name: consent
    type: consent
    title: "Informed Consent"
    content: |
      ## Study Information
      **Purpose**: This study collects sentiment annotations for NLP research.
      **Task**: You will read short texts and classify their sentiment.
      **Duration**: Approximately 20-30 minutes
      **Compensation**: $X.XX via Prolific
      **Risks**: Minimal - you may encounter mildly negative content
      **Data Use**: Responses will be anonymized and used for research
      **Contact**: researcher@university.edu
      By clicking "I Agree", you consent to participate.
    require_checkbox: true
    checkbox_text: "I have read and understand the above information"
    agree_button: "I Agree - Begin Study"
    decline_button: "I Do Not Agree - Exit"

  - name: instructions
    type: instructions
    title: "Task Instructions"
    content: |
      ## Your Task
      Read each text and classify its sentiment as:
      - **Positive**: Happy, satisfied, enthusiastic
      - **Negative**: Sad, angry, disappointed
      - **Neutral**: Factual, no clear emotion

      ## Important Notes
      - Read each text carefully
      - There are attention checks - please stay focused
      - You have 2 minutes per item maximum
      - Complete all 50 items to receive payment

  - name: training
    type: training
    items: 5
    require_pass: true
    pass_threshold: 0.8
    feedback: true
    max_attempts: 2

  - name: main_task
    type: annotation
```

Setting Up the Study on Prolific
Step 1: Create Study
- Go to Prolific Dashboard → New Study
- Fill in study details:
  - Title: Clear, descriptive name
  - Description: What participants will do
  - Estimated time: Be accurate (affects pay rate display)
Step 2: Configure Study URL
https://your-server.com/annotate?PROLIFIC_PID={{%PROLIFIC_PID%}}&STUDY_ID={{%STUDY_ID%}}&SESSION_ID={{%SESSION_ID%}}
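Prolific fills in the `{{%...%}}` placeholders before the participant reaches your server, so Potato sees ordinary query parameters. As a minimal illustration of what arrives (not Potato's internal code; the IDs below are made up), the participant ID can be read straight from the query string:

```python
# Illustration only -- Potato handles this for you via login.url_argument.
# The ID values below are fabricated examples of what Prolific substitutes.
from urllib.parse import urlparse, parse_qs

url = ("https://your-server.com/annotate"
       "?PROLIFIC_PID=5f8a1b2c3d4e5f6a7b8c9d0e"
       "&STUDY_ID=60d21b4667d0d8992e610c85"
       "&SESSION_ID=61a7f3e9b2c4d5e6f7a8b9c0")

params = parse_qs(urlparse(url).query)
print(params["PROLIFIC_PID"][0])  # used as the annotator's username
```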
Step 3: Set Completion Code
- In Potato config, set a unique completion code
- In Prolific, use the redirect completion URL:
https://app.prolific.co/submissions/complete?cc=YOUR_CODE
Step 4: Participant Requirements
Filter participants by:
- Demographics: Age, gender, nationality
- Language: First language, fluency
- Approval rate: Minimum past approval %
- Custom: Previous study participation
Step 5: Payment
- Set fair compensation (Prolific requires minimum wage)
- Calculate: (estimated_time_minutes / 60) × hourly_rate (see the sketch after this list)
- Consider adding bonus for quality
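A minimal sketch of that calculation (the numbers are placeholders; check Prolific's current minimum-rate rules when setting the actual reward):

```python
import math

def reward(estimated_minutes: float, hourly_rate: float) -> float:
    """Reward implied by a target hourly rate, rounded up to the nearest cent."""
    return math.ceil(estimated_minutes / 60 * hourly_rate * 100) / 100

print(reward(25, 12.00))  # 25-minute task at 12.00/hour -> 5.0
```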
Handling Participant Flow
Use the phases section to define the flow of your study:
```yaml
phases:
  - name: consent
    type: consent
    title: "Informed Consent"
    content: "Your consent form content here..."

  - name: instructions
    type: instructions
    title: "Task Instructions"
    content: "Your instructions here..."

  - name: training
    type: training
    items: 5
    require_pass: true
    pass_threshold: 0.8

  - name: main_task
    type: annotation
```

Quality Control for Crowdsourcing
Attention Checks
```yaml
quality_control:
  attention_checks:
    enabled: true
    frequency: 10
    items:
      - text: "Please select 'Positive' for this item."
        expected: "Positive"
      - text: "ATTENTION CHECK: Select 'Negative'"
        expected: "Negative"
    on_fail:
      first_fail: warn
      second_fail: end_session
      fail_message: "Please read items more carefully."
```
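You can also double-check attention-check answers after the study from your exported annotations. The script below is a post-processing sketch, not part of Potato; the `text` and `label` keys are assumptions about your export format and should be adjusted to match it:

```python
# Post-hoc sanity check: count incorrectly answered attention-check items.
ATTENTION_ITEMS = {
    "Please select 'Positive' for this item.": "Positive",
    "ATTENTION CHECK: Select 'Negative'": "Negative",
}

def count_failed_checks(annotations: list[dict]) -> int:
    """Return how many attention-check items were answered incorrectly."""
    failed = 0
    for record in annotations:
        expected = ATTENTION_ITEMS.get(record.get("text", ""))
        if expected is not None and record.get("label") != expected:
            failed += 1
    return failed
```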
Time-Based Quality

```yaml
quality_control:
  timing:
    min_time_per_item: 3  # seconds
    flag_below: 2
    block_below: 1
    max_time_per_item: 300
    warn_above: 120
```
Response Quality

```yaml
quality_control:
  response_quality:
    check_consistency: true
    consistency_items: 5          # Repeat 5 items
    max_inconsistency: 0.3
    check_distribution: true
    flag_single_label_ratio: 0.9  # Flag if >90% same label
```
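The distribution check can also be reproduced offline when reviewing submissions. This is an illustrative equivalent of `flag_single_label_ratio`, not Potato's implementation:

```python
# Flag a participant whose annotations are dominated by a single label.
from collections import Counter

def is_single_label_dominated(labels: list[str], ratio: float = 0.9) -> bool:
    """True if more than `ratio` of the labels are the same value."""
    if not labels:
        return False
    _, most_common_count = Counter(labels).most_common(1)[0]
    return most_common_count / len(labels) > ratio

print(is_single_label_dominated(["Positive"] * 48 + ["Neutral", "Negative"]))  # True
```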
Handling Rejections

Quality control failures are handled through the quality_control section: depending on the on_fail settings, participants who fail attention checks can be warned on a first failure, have their session ended on a second, or be redirected out of the study.
Payment and Bonuses
Bonuses are managed through Prolific's interface. After your study completes, export your annotation data and calculate bonuses based on your criteria, then upload them to Prolific.
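For example, assuming you bonus participants who passed every attention check, a short script can turn your exported records into the participant-ID/amount lines that Prolific's bulk bonus tool accepts. The criterion, amount, and file path below are illustrative, and you should confirm the expected upload format in Prolific's interface:

```python
# Illustrative bonus calculation; adapt the criterion, amount, and paths to your study.
import json

BONUS_AMOUNT = 0.50  # bonus per qualifying participant, in your study's currency

with open("annotation_output/results.json") as f:   # hypothetical export path
    results = json.load(f)                           # list of per-participant records

lines = []
for record in results:
    meta = record.get("metadata", {})
    if meta.get("attention_checks_failed", 0) == 0:  # example criterion: no failed checks
        lines.append(f"{record['participant_id']},{BONUS_AMOUNT:.2f}")

print("\n".join(lines))  # paste into Prolific's bulk bonus payments box
```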
Monitoring Your Study
Monitor your study through:
- Potato's built-in admin dashboard
- Prolific's study monitoring interface
- Your annotation output files
Output Format
```json
{
  "participant_id": "PROLIFIC_PID_XXXXX",
  "study_id": "STUDY_ID_XXXXX",
  "session_id": "SESSION_ID_XXXXX",
  "annotations": [...],
  "metadata": {
    "start_time": "2026-01-20T10:00:00Z",
    "end_time": "2026-01-20T10:25:00Z",
    "duration_minutes": 25,
    "items_completed": 50,
    "attention_checks_passed": 5,
    "attention_checks_failed": 0
  },
  "quality_metrics": {
    "avg_time_per_item": 28.5,
    "consistency_score": 0.92
  }
}
```
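A quick way to skim these records while the study is running is a few lines of Python. This sketch assumes one record per file in the format shown above; the file path is hypothetical:

```python
# Summarize one participant's export for monitoring purposes.
import json

with open("annotation_output/PROLIFIC_PID_XXXXX.json") as f:
    record = json.load(f)

meta, quality = record["metadata"], record["quality_metrics"]
print(f"{record['participant_id']}: {meta['items_completed']} items "
      f"in {meta['duration_minutes']} min, "
      f"{meta['attention_checks_failed']} failed checks, "
      f"avg {quality['avg_time_per_item']}s/item")
```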
Tips for Prolific Studies

- Pilot first: Run with 5-10 participants to test
- Fair pay: Prolific participants expect research-level rates
- Clear instructions: Reduce confusion and rejections
- Mobile-friendly: Some participants use phones
- Quick approval: Approve submissions promptly
Next Steps
- Learn about MTurk deployment for comparison
- Set up quality control in detail
- Calculate inter-annotator agreement
Full crowdsourcing documentation at /docs/deployment/crowdsourcing.