Structured Sentiment Analysis
Fine-grained structured sentiment annotation identifying opinion holders, targets, and polar expressions with their polarity, based on SemEval-2022 Task 10 (Barnes et al., SemEval 2022). Annotators mark opinion tuples by highlighting spans and classifying polarity.
Configuration file: config.yaml
# Structured Sentiment Analysis
# Based on Barnes et al., SemEval 2022
# Paper: https://aclanthology.org/2022.semeval-1.180/
# Dataset: https://github.com/jerbarnes/semeval22_structured_sentiment
#
# This task implements structured sentiment analysis where annotators
# identify the components of opinion expressions: who holds the opinion
# (Source/Holder), what the opinion is about (Target), and the evaluative
# expression itself (Polar Expression). Additionally, annotators classify
# the overall polarity of each opinion.
#
# Span Labels:
# - Source/Holder: The entity expressing the opinion (person, organization)
# - Target: The entity or aspect being evaluated
# - Polar Expression: The word(s) carrying the evaluative sentiment
#
# Polarity Labels:
# - Positive: Favorable opinion or evaluation
# - Negative: Unfavorable opinion or evaluation
# - Mixed: Both positive and negative elements present
# - Neutral: Factual or balanced without clear sentiment
#
# Annotation Guidelines:
# 1. Read the text and identify all opinion expressions
# 2. For each opinion, highlight the source/holder, target, and polar expression
# 3. Classify the overall polarity of the dominant opinion
# 4. A sentence may contain multiple opinions with different polarities
# 5. Mark all relevant spans even if some roles are implicit
annotation_task_name: "Structured Sentiment Analysis"
task_dir: "."
data_files:
  - sample-data.json
item_properties:
  id_key: "id"
  text_key: "text"
output_annotation_dir: "annotation_output/"
output_annotation_format: "json"
port: 8000
server_name: localhost
annotation_schemes:
  # Step 1: Highlight opinion components
  - annotation_type: span
    name: opinion_spans
    description: "Highlight the opinion components in the text"
    labels:
      - "Source/Holder"
      - "Target"
      - "Polar Expression"
    tooltips:
      "Source/Holder": "The entity expressing the opinion (e.g., a person, reviewer, or organization)"
      "Target": "The entity, product, or aspect being evaluated or discussed"
      "Polar Expression": "The word(s) that carry evaluative sentiment (e.g., 'excellent', 'terrible')"
  # Step 2: Classify overall polarity
  - annotation_type: radio
    name: polarity
    description: "What is the overall polarity of the dominant opinion in this text?"
    labels:
      - "Positive"
      - "Negative"
      - "Mixed"
      - "Neutral"
    keyboard_shortcuts:
      "Positive": "1"
      "Negative": "2"
      "Mixed": "3"
      "Neutral": "4"
    tooltips:
      "Positive": "The dominant opinion is favorable or approving"
      "Negative": "The dominant opinion is unfavorable or disapproving"
      "Mixed": "The text contains both positive and negative opinions of comparable strength"
      "Neutral": "The text is factual or balanced without clear evaluative sentiment"
annotation_instructions: |
  You will be shown text containing opinions. Your task is to:
  1. Identify and highlight the components of each opinion using span annotation:
     - **Source/Holder**: Who is expressing the opinion?
     - **Target**: What is the opinion about?
     - **Polar Expression**: What words carry the evaluative sentiment?
  2. Classify the overall polarity of the dominant opinion in the text.
  A single sentence may contain multiple opinions. Try to mark all relevant spans.
  If a role is implicit (e.g., the author is the implied holder), you may skip that span.
html_layout: |
  <div style="padding: 15px; max-width: 800px; margin: auto;">
    <div style="background: #f0f9ff; border: 1px solid #bae6fd; border-radius: 8px; padding: 16px; margin-bottom: 16px;">
      <strong style="color: #0369a1;">Text:</strong>
      <p style="font-size: 16px; line-height: 1.7; margin: 8px 0 0 0;">{{text}}</p>
    </div>
  </div>
allow_all_users: true
instances_per_annotator: 50
annotation_per_instance: 2
allow_skip: true
skip_reason_required: false
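The opinion tuples this task collects (holder, target, polar expression, plus a polarity label) can be sketched as a small data structure. This is an illustrative representation only, loosely following the SemEval-2022 Task 10 components described above; the class and field names are assumptions, not the dataset's exact schema.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical container for one annotated opinion tuple.
# Field names are illustrative, not the SemEval-2022 data format.
@dataclass
class Opinion:
    holder: Optional[str]    # Source/Holder span text (None if implicit)
    target: Optional[str]    # Target span text (None if implicit)
    expression: str          # Polar Expression span text
    polarity: str            # "Positive" | "Negative" | "Mixed" | "Neutral"

# Example built from the first sample item below.
op = Opinion(
    holder="The critics",
    target="the director's bold vision",
    expression="praised",
    polarity="Positive",
)
print(op.polarity)  # Positive
```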
Sample data: sample-data.json
[
  {
    "id": "sentiment_001",
    "text": "The critics praised the director's bold vision, calling the film a masterpiece of modern cinema."
  },
  {
    "id": "sentiment_002",
    "text": "Customers have complained that the new software update is slow and buggy, making it nearly impossible to complete basic tasks."
  }
]
// ... and 8 more items
Get this design
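The config's `item_properties` tells the tool to read each item's identifier from `id` and its text from `text`. A quick standalone sanity check (not part of Potato itself) can confirm every item in a data file exposes those keys before annotation starts:

```python
import json

# Keys taken from config.yaml's item_properties section.
ID_KEY, TEXT_KEY = "id", "text"

def validate_items(items):
    """Return a list of (item_index, missing_key) pairs; empty means OK."""
    problems = []
    for i, item in enumerate(items):
        for key in (ID_KEY, TEXT_KEY):
            if key not in item:
                problems.append((i, key))
    return problems

# Example using an inline item shaped like sample-data.json.
items = json.loads('[{"id": "sentiment_001", "text": "The critics praised..."}]')
print(validate_items(items))  # [] -> every item is well-formed
```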
Clone or download from the repository
Quick start:
git clone https://github.com/davidjurgens/potato-showcase.git
cd potato-showcase/text/computational-social-science/structured-sentiment
potato start config.yaml
Found a problem or want to improve this design? Open an issue
Related designs
Aspect-Based Sentiment Analysis
Identification of aspect terms in review text with sentiment polarity classification for each aspect. Based on SemEval-2016 Task 5 (ABSA).
Aspect-Based Sentiment Analysis (Original ABSA)
Identify aspect terms in review text and classify their sentiment polarity, based on SemEval-2014 Task 4 (Pontiki et al.). Annotators highlight aspect terms and assign sentiment labels across restaurant and laptop review domains.
Causal Medical Claim Detection and PICO Extraction
Detection of causal claims in medical texts and extraction of PICO (Population, Intervention, Comparator, Outcome) elements. Based on SemEval-2023 Task 8 (Khetan et al.).