Counterfactual Detection and Reasoning
Detect counterfactual statements and annotate their antecedent and consequent spans, based on SemEval-2020 Task 5 (Yang et al., 2020). Annotators decide whether a statement describes a hypothetical situation contrary to known facts and, if it does, mark the condition (antecedent) and its hypothetical result (consequent).
Configuration file: config.yaml
# Counterfactual Detection and Reasoning
# Based on Yang et al., SemEval 2020
# Paper: https://aclanthology.org/2020.semeval-1.40/
# Dataset: https://github.com/arielsho/SemEval-2020-Task-5
#
# Annotators classify whether a statement is counterfactual (describing a
# hypothetical scenario contrary to known facts) and mark the antecedent
# (the hypothetical condition) and consequent (the hypothetical result).
annotation_task_name: "Counterfactual Detection and Reasoning"
task_dir: "."
data_files:
  - sample-data.json
item_properties:
  id_key: "id"
  text_key: "text"
output_annotation_dir: "annotation_output/"
output_annotation_format: "json"
port: 8000
server_name: localhost
annotation_schemes:
  - annotation_type: radio
    name: counterfactual_label
    description: "Is this statement counterfactual?"
    labels:
      - "Counterfactual"
      - "Not Counterfactual"
    keyboard_shortcuts:
      "Counterfactual": "1"
      "Not Counterfactual": "2"
    tooltips:
      "Counterfactual": "The statement describes a hypothetical situation that is contrary to known facts"
      "Not Counterfactual": "The statement does not describe a counterfactual scenario"
  - annotation_type: span
    name: counterfactual_spans
    description: "If counterfactual, highlight the antecedent (condition) and consequent (result)."
    labels:
      - "Antecedent"
      - "Consequent"
annotation_instructions: |
  You will see a statement that may or may not be counterfactual. Your task is to:
  1. Read the statement carefully.
  2. Determine if it is counterfactual (describes a hypothetical situation contrary to fact).
  3. If counterfactual, highlight the antecedent (the "if" condition) and the consequent
     (the "then" result).
  Example of a counterfactual: "If I had studied harder, I would have passed the exam."
  Example of a non-counterfactual: "If it rains tomorrow, I will bring an umbrella."
html_layout: |
  <div style="padding: 15px; max-width: 800px; margin: auto;">
    <div style="background: #f0f9ff; border: 1px solid #bae6fd; border-radius: 8px; padding: 16px; margin-bottom: 16px;">
      <strong style="color: #0369a1;">Statement:</strong>
      <p style="font-size: 16px; line-height: 1.7; margin: 8px 0 0 0;">{{text}}</p>
    </div>
  </div>
allow_all_users: true
instances_per_annotator: 50
annotation_per_instance: 2
allow_skip: true
skip_reason_required: false
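Before launching, it can help to confirm that every item in the data file exposes the keys named under `item_properties` (`id_key` and `text_key`) and that ids are unique. A minimal sketch; the `validate` helper is hypothetical, not part of Potato, and the two items are copied from the sample data below:

```python
import json

# Keys the config's item_properties declare for each item.
ID_KEY, TEXT_KEY = "id", "text"

# Two items copied from sample-data.json.
raw = """
[
  {"id": "cf_001",
   "text": "If the Titanic had carried enough lifeboats, hundreds more passengers would have survived the disaster."},
  {"id": "cf_002",
   "text": "The company announced that quarterly earnings exceeded analyst expectations by a significant margin."}
]
"""

def validate(items):
    """Check that every item has the expected keys and a unique id; return the count."""
    seen = set()
    for item in items:
        assert ID_KEY in item and TEXT_KEY in item, f"missing key in {item}"
        assert item[ID_KEY] not in seen, f"duplicate id {item[ID_KEY]}"
        seen.add(item[ID_KEY])
    return len(items)

print(validate(json.loads(raw)))  # → 2
```

Running the same check over the full ten-item file before `potato start` catches schema mismatches early, since the server reads items through the configured `id_key`/`text_key`.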
Sample data: sample-data.json
[
{
"id": "cf_001",
"text": "If the Titanic had carried enough lifeboats, hundreds more passengers would have survived the disaster."
},
{
"id": "cf_002",
"text": "The company announced that quarterly earnings exceeded analyst expectations by a significant margin."
}
]
// ... and 8 more items

Get this layout
Clone or download from the repository
Quick start:
git clone https://github.com/davidjurgens/potato-showcase.git
cd potato-showcase/semeval/2020/task05-counterfactual
potato start config.yaml
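Once the server is running, annotators highlight antecedent and consequent spans directly in the statement. Conceptually, each highlighted span reduces to character offsets over the item's text; a minimal sketch for the example sentence from the instructions (the `(label, start, end)` tuple format is an illustrative assumption, not Potato's actual output schema):

```python
# Character-offset view of the span scheme. The tuple format below is
# illustrative only -- it is NOT Potato's actual output schema.
text = "If I had studied harder, I would have passed the exam."

comma = text.index(",")
spans = [
    ("Antecedent", 0, comma),                  # "If I had studied harder"
    ("Consequent", comma + 2, len(text) - 1),  # "I would have passed the exam"
]

for label, start, end in spans:
    print(f"{label}: {text[start:end]!r}")
```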
Found a problem or want to improve this layout? Open an issue.

Related layouts
Aspect-Based Sentiment Analysis
Identification of aspect terms in review text with sentiment polarity classification for each aspect. Based on SemEval-2016 Task 5 (ABSA).
Causal Medical Claim Detection and PICO Extraction
Detection of causal claims in medical texts and extraction of PICO (Population, Intervention, Comparator, Outcome) elements. Based on SemEval-2023 Task 8 (Khetan et al.).
Character Identification on Multiparty Dialogues
Identification and linking of character mentions in TV show dialogue, combining span annotation with entity resolution for the main cast of Friends. Based on SemEval-2018 Task 4.