Text & Number Input

Free-form text and numeric input for annotations.

Text and number inputs allow annotators to provide free-form responses, useful for corrections, explanations, counts, and measurements.

Text Input

Basic Text Field

Single-line text input:

annotation_schemes:
  - annotation_type: text
    name: correction
    description: "Provide a corrected version of the text"

Textarea (Multi-line)

For longer responses:

- annotation_type: text
  name: explanation
  description: "Explain your reasoning"
  textarea: true

Placeholder Text

Guide annotators with example input:

- annotation_type: text
  name: summary
  description: "Write a one-sentence summary"
  placeholder: "Enter your summary here..."

Character Limits

Constrain response length:

- annotation_type: text
  name: title
  description: "Suggest a title"
  min_length: 10
  max_length: 100

Required Text

Make the field mandatory:

- annotation_type: text
  name: justification
  description: "Why did you choose this label?"
  required: true

Number Input

Basic Number Field

annotation_schemes:
  - annotation_type: number
    name: count
    description: "How many entities are mentioned?"

Range Constraints

Set minimum and maximum values:

- annotation_type: number
  name: rating
  description: "Rate from 1 to 10"
  min: 1
  max: 10

Step Size

Control increment precision:

- annotation_type: number
  name: percentage
  description: "What percentage is relevant?"
  min: 0
  max: 100
  step: 5  # Increments of 5

Decimal Numbers

Allow floating-point values:

- annotation_type: number
  name: score
  description: "Confidence score"
  min: 0.0
  max: 1.0
  step: 0.1

Default Value

Pre-fill with a default:

- annotation_type: number
  name: count
  description: "Number of errors"
  default: 0
  min: 0

Slider Input

Visual alternative to number input:

- annotation_type: slider
  name: confidence
  description: "How confident are you?"
  min: 0
  max: 100
  step: 1

Slider with Labels

Add endpoint labels:

- annotation_type: slider
  name: agreement
  description: "How much do you agree?"
  min: 0
  max: 100
  min_label: "Strongly Disagree"
  max_label: "Strongly Agree"

Slider Display Options

Show the current value:

- annotation_type: slider
  name: rating
  min: 0
  max: 100
  show_value: true

Common Use Cases

Text Correction Task

annotation_schemes:
  - annotation_type: radio
    name: has_error
    description: "Does this text contain errors?"
    labels:
      - "Yes"
      - "No"
 
  - annotation_type: text
    name: corrected_text
    description: "Provide the corrected version"
    textarea: true
    show_if:
      scheme: has_error
      value: "Yes"

Translation Quality

annotation_schemes:
  - annotation_type: slider
    name: adequacy
    description: "How much meaning is preserved?"
    min: 0
    max: 100
    min_label: "None"
    max_label: "All"
 
  - annotation_type: slider
    name: fluency
    description: "How natural does it sound?"
    min: 0
    max: 100
    min_label: "Incomprehensible"
    max_label: "Perfect"
 
  - annotation_type: text
    name: improved_translation
    description: "Suggest a better translation (optional)"
    textarea: true
    required: false

Entity Counting

annotation_schemes:
  - annotation_type: number
    name: person_count
    description: "How many people are mentioned?"
    min: 0
    max: 50
 
  - annotation_type: number
    name: org_count
    description: "How many organizations are mentioned?"
    min: 0
    max: 50
 
  - annotation_type: number
    name: location_count
    description: "How many locations are mentioned?"
    min: 0
    max: 50

Feedback Collection

annotation_schemes:
  - annotation_type: likert
    name: difficulty
    description: "How difficult was this task?"
    size: 5
    min_label: "Very Easy"
    max_label: "Very Hard"
 
  - annotation_type: text
    name: feedback
    description: "Any additional feedback?"
    textarea: true
    required: false
    placeholder: "Share your thoughts..."

Quality Assessment with Justification

annotation_schemes:
  - annotation_type: radio
    name: quality
    description: "Rate the quality"
    labels:
      - Excellent
      - Good
      - Fair
      - Poor
 
  - annotation_type: text
    name: justification
    description: "Explain your rating"
    textarea: true
    required: true
    min_length: 20

Validation

Text Validation

- annotation_type: text
  name: email
  description: "Enter contact email"
  validation:
    pattern: "^[a-zA-Z0-9+_.-]+@[a-zA-Z0-9.-]+$"
    message: "Please enter a valid email address"

Number Validation

Numbers are automatically validated against min/max:

- annotation_type: number
  name: year
  description: "Enter the year"
  min: 1900
  max: 2024
  validation_message: "Year must be between 1900 and 2024"

Keyboard Navigation

Text and number fields support standard keyboard navigation:

  • Tab to move between fields
  • Enter to submit (for single-line text)
  • Arrow keys for number increment/decrement

Output Format

Text, number, and slider values are saved directly, keyed by each scheme's name:

{
  "id": "doc1",
  "correction": "The corrected text goes here.",
  "count": 5,
  "confidence": 85
}
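
Each key in the output corresponds to the name of a scheme in the configuration. As a minimal sketch (reusing the correction, count, and confidence schemes from the examples above), the following configuration would produce output in this shape:

annotation_schemes:
  - annotation_type: text
    name: correction
    description: "Provide a corrected version of the text"
 
  - annotation_type: number
    name: count
    description: "How many entities are mentioned?"
 
  - annotation_type: slider
    name: confidence
    description: "How confident are you?"
    min: 0
    max: 100

The id field identifies the annotated item; the remaining keys mirror the scheme names.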

Full Example: Document Review

task_name: "Document Review"
 
annotation_schemes:
  # Quality rating
  - annotation_type: likert
    name: quality
    description: "Overall document quality"
    size: 5
    min_label: "Poor"
    max_label: "Excellent"
 
  # Error count
  - annotation_type: number
    name: error_count
    description: "Number of errors found"
    min: 0
    max: 100
    default: 0
 
  # Confidence slider
  - annotation_type: slider
    name: confidence
    description: "How confident are you in this assessment?"
    min: 0
    max: 100
    show_value: true
 
  # Detailed feedback
  - annotation_type: text
    name: errors_found
    description: "List the errors you found"
    textarea: true
    placeholder: "Describe each error..."
 
  # Summary
  - annotation_type: text
    name: summary
    description: "Brief summary of the document"
    max_length: 280
    placeholder: "Summarize in one sentence..."

Best Practices

  1. Use appropriate input types - sliders for continuous judgments, numbers for precise counts
  2. Set reasonable constraints - min/max values prevent invalid data
  3. Provide placeholders - guide annotators on the expected format
  4. Make optional fields clear - set required: false and note the optionality in the description
  5. Use conditional display - show text fields only when they apply
  6. Consider validation - use patterns for structured input such as emails or IDs (a combined sketch follows this list)
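
The sketch below is a rough illustration rather than a prescribed configuration: it combines range constraints, a default, a placeholder, an optional conditional field, and pattern validation, using only keys shown earlier on this page. The scheme names (needs_followup, issue_count, reviewer_id) and the ID format are hypothetical.

annotation_schemes:
  - annotation_type: radio
    name: needs_followup
    description: "Does this document need follow-up?"
    labels:
      - "Yes"
      - "No"
 
  # Shown only when follow-up is needed; optional, with a guiding placeholder
  - annotation_type: text
    name: followup_reason
    description: "Why is follow-up needed? (optional)"
    textarea: true
    required: false
    placeholder: "Describe the issue..."
    show_if:
      scheme: needs_followup
      value: "Yes"
 
  # Constrained count with a sensible default
  - annotation_type: number
    name: issue_count
    description: "Number of issues found"
    min: 0
    max: 50
    default: 0
 
  # Structured input checked with a validation pattern (hypothetical ID format)
  - annotation_type: text
    name: reviewer_id
    description: "Enter your reviewer ID (e.g. REV-123)"
    validation:
      pattern: "^REV-[0-9]+$"
      message: "IDs look like REV-123"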