Behavioral Tracking
Capture detailed interaction data for quality analysis and research.
Potato's behavioral tracking system captures detailed interaction data during annotation sessions, enabling researchers to analyze annotator behavior, timing patterns, AI assistance usage, and decision-making processes.
Overview
The behavioral tracking system captures:
- Every annotation action: Label selections, span annotations, text inputs
- Precise timestamps: Server and client-side timestamps
- AI assistance usage: When suggestions were shown and whether they were accepted
- Focus and timing data: Time spent on each element, scroll depth
- Navigation history: Complete path through instances
What Gets Tracked
Interaction Events
Every user interaction with the annotation interface is captured:
| Event Type | Description | Example Target |
|---|---|---|
| `click` | Mouse clicks on elements | `label:positive`, `nav:next` |
| `focus_in` | Element receives focus | `textbox:explanation` |
| `focus_out` | Element loses focus | `label:negative` |
| `keypress` | Keyboard shortcuts | `key:1`, `nav:ArrowRight` |
| `navigation` | Instance navigation | `next`, `prev`, `instance_load` |
| `save` | Annotation save events | `instance:123` |
| `annotation_change` | Label modifications | `schema:sentiment` |
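As a sketch of how these captured events might be consumed downstream, the following tallies events by type. The `event_type` field name is an assumption based on the table above, not a guaranteed schema; verify it against your own output files.

```python
from collections import Counter

def count_event_types(interactions: list[dict]) -> Counter:
    """Tally interaction events by type.

    Assumes each event dict carries an 'event_type' key matching the
    table above (click, focus_in, keypress, ...); the exact field name
    may differ across Potato versions.
    """
    return Counter(event.get("event_type", "unknown") for event in interactions)

# Hypothetical events for illustration:
events = [
    {"event_type": "click", "target": "label:positive"},
    {"event_type": "click", "target": "nav:next"},
    {"event_type": "keypress", "target": "key:1"},
]
counts = count_event_types(events)  # click: 2, keypress: 1
```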
AI Assistance Usage
Complete lifecycle tracking for AI-assisted annotation:
```json
{
  "request_timestamp": 1706500010.0,
  "response_timestamp": 1706500012.5,
  "schema_name": "sentiment",
  "suggestions_shown": ["positive", "neutral"],
  "suggestion_accepted": "positive",
  "time_to_decision_ms": 3500
}
```

Annotation Changes
Detailed change history for all annotations:
```json
{
  "timestamp": 1706500002.5,
  "schema_name": "sentiment",
  "label_name": "positive",
  "action": "select",
  "old_value": null,
  "new_value": true,
  "source": "user"
}
```

Source types:

- `user`: Direct user interaction
- `ai_accept`: User accepted an AI suggestion
- `keyboard`: Keyboard shortcut used
- `prefill`: Pre-filled from configuration
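To illustrate how the `source` field might be used in analysis, here is a hedged sketch that computes the share of changes attributable to each source. The field names follow the change record shown above; check them against your actual output files.

```python
from collections import Counter

def source_breakdown(annotation_changes: list[dict]) -> dict:
    """Fraction of annotation changes per source
    ('user', 'ai_accept', 'keyboard', 'prefill').

    Assumes each change record has a 'source' field as documented above.
    """
    counts = Counter(change.get("source", "unknown") for change in annotation_changes)
    total = sum(counts.values())
    return {src: n / total for src, n in counts.items()} if total else {}

# Hypothetical change records:
changes = [
    {"schema_name": "sentiment", "source": "user"},
    {"schema_name": "sentiment", "source": "ai_accept"},
    {"schema_name": "sentiment", "source": "user"},
]
shares = source_breakdown(changes)
```

A high `ai_accept` share for one annotator, for example, can signal over-reliance on AI suggestions.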
Data Format
Each annotation instance includes a behavioral_data object:
```json
{
  "id": "instance_123",
  "annotations": {
    "sentiment": {"positive": true}
  },
  "behavioral_data": {
    "instance_id": "instance_123",
    "session_start": 1706500000.0,
    "session_end": 1706500045.0,
    "total_time_ms": 45000,
    "interactions": [...],
    "ai_usage": [...],
    "annotation_changes": [...],
    "navigation_history": [...],
    "focus_time_by_element": {
      "label:positive": 2500,
      "textbox:explanation": 8000
    },
    "scroll_depth_max": 75.5
  }
}
```

Configuration
Behavioral tracking is enabled by default. No additional configuration is required.
Frontend Debug Mode
To enable debug logging for the interaction tracker:
```javascript
// In browser console
window.interactionTracker.setDebugMode(true);
```

Analysis Examples
Loading Behavioral Data
```python
import json
from pathlib import Path

def load_behavioral_data(annotation_dir: str) -> dict:
    """Load per-user behavioral data from each user's user_state.json."""
    data = {}
    for user_dir in Path(annotation_dir).iterdir():
        if not user_dir.is_dir():
            continue
        state_file = user_dir / 'user_state.json'
        if state_file.exists():
            with open(state_file) as f:
                user_state = json.load(f)
            user_id = user_state.get('user_id')
            behavioral = user_state.get('instance_id_to_behavioral_data', {})
            data[user_id] = behavioral
    return data
```

Analyzing Annotation Time
```python
def analyze_annotation_time(behavioral_data: dict) -> dict:
    """Summarize per-user annotation times, in seconds."""
    stats = {}
    for user_id, instances in behavioral_data.items():
        times = []
        for instance_id, bd in instances.items():
            if 'total_time_ms' in bd:
                times.append(bd['total_time_ms'] / 1000)
        if times:
            stats[user_id] = {
                'mean_time': sum(times) / len(times),
                'min_time': min(times),
                'max_time': max(times),
                'total_instances': len(times)
            }
    return stats
```

Detecting Suspicious Behavior
```python
def detect_suspicious_annotators(behavioral_data: dict,
                                 min_time_threshold: float = 2.0) -> list:
    """Flag users who annotate more than half their instances
    faster than min_time_threshold seconds."""
    suspicious = []
    for user_id, instances in behavioral_data.items():
        fast_count = 0
        for instance_id, bd in instances.items():
            time_sec = bd.get('total_time_ms', 0) / 1000
            if time_sec < min_time_threshold:
                fast_count += 1
        total = len(instances)
        if total > 0:
            fast_rate = fast_count / total
            if fast_rate > 0.5:
                suspicious.append({
                    'user_id': user_id,
                    'fast_rate': fast_rate,
                    'total_instances': total
                })
    return suspicious
```

Admin Dashboard Integration
The Admin Dashboard includes a Behavioral Analytics tab with:
- User Interaction Heatmap: Visual representation of interaction patterns
- AI Assistance Metrics: Accept/reject rates, decision times
- Timing Distribution: Histogram of annotation times
- Suspicious Activity Alerts: Flagged annotators requiring review
API Endpoints
Track Interactions
```http
POST /api/track_interactions
Content-Type: application/json

{
  "instance_id": "instance_123",
  "events": [...],
  "focus_time": {"element": ms},
  "scroll_depth": 75.5
}
```

Get Behavioral Data
```http
GET /api/behavioral_data/<instance_id>
```

Returns the complete behavioral data for an instance.
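As a sketch, a client could assemble the request body for `POST /api/track_interactions` like this. The field names mirror the request shown above; the helper function itself is hypothetical (the real batching lives in `interaction_tracker.js`).

```python
import json

def build_tracking_payload(instance_id: str, events: list,
                           focus_time: dict, scroll_depth: float) -> str:
    """Serialize a tracking batch in the shape documented for
    POST /api/track_interactions. Illustrative helper only."""
    return json.dumps({
        "instance_id": instance_id,
        "events": events,
        "focus_time": focus_time,
        "scroll_depth": scroll_depth,
    })

body = build_tracking_payload(
    "instance_123",
    [{"event_type": "click", "target": "label:positive"}],
    {"label:positive": 2500},
    75.5,
)
```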
Best Practices
For Researchers
- Baseline Establishment: Collect behavioral data from known-good annotators to establish baselines
- Quality Metrics: Use behavioral data alongside annotation agreement for quality assessment
- Training Evaluation: Compare pre- and post-training behavioral patterns
- AI Impact Analysis: Measure how AI assistance affects annotation quality and speed
For Annotation Projects
- Monitor in Real-Time: Use the admin dashboard to spot issues early
- Set Thresholds: Define acceptable ranges for timing and interaction metrics
- Provide Feedback: Use behavioral insights to provide targeted annotator feedback
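The "set thresholds" advice can be sketched on top of per-user timing stats of the shape returned by `analyze_annotation_time` above. The bounds used here are illustrative placeholders, not recommended values:

```python
def flag_out_of_range(stats: dict,
                      min_mean: float = 3.0,
                      max_mean: float = 120.0) -> list:
    """Flag users whose mean annotation time (seconds) falls outside
    an acceptable range. `stats` follows the per-user dict shape
    produced by analyze_annotation_time; thresholds are illustrative."""
    return [
        {"user_id": user_id, "mean_time": s["mean_time"]}
        for user_id, s in stats.items()
        if not (min_mean <= s["mean_time"] <= max_mean)
    ]

# Hypothetical per-user stats:
example_stats = {
    "annotator_a": {"mean_time": 1.2},   # suspiciously fast
    "annotator_b": {"mean_time": 20.0},  # within range
}
flagged = flag_out_of_range(example_stats)  # flags annotator_a only
```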
Troubleshooting
No Behavioral Data Being Collected
- Verify `interaction_tracker.js` is loaded (check the browser Network tab)
- Check the browser console for JavaScript errors
- Verify the API endpoints are accessible (`/api/track_interactions`)
Data Not Persisting
- Check that user state is being saved (look for `user_state.json`)
- Ensure the annotation output directory is writable
Further Reading
- Admin Dashboard - Real-time monitoring
- Annotation History - Detailed change tracking
- Quality Control - Automated quality checks
For implementation details, see the source documentation.