Argument Reasoning in Civil Procedure
Legal argument reasoning task requiring annotators to answer multiple-choice questions about civil procedure by selecting the best answer and providing legal reasoning. Based on SemEval-2024 Task 5.
Configuration File: config.yaml
# Argument Reasoning in Civil Procedure
# Based on Bongard et al., SemEval 2024
# Paper: https://aclanthology.org/volumes/2024.semeval-1/
# Dataset: https://github.com/SemEval/semeval-2024-task5
#
# This task asks annotators to answer multiple-choice questions about
# civil procedure law by selecting the best answer from five options
# and providing a brief legal reasoning explanation.
annotation_task_name: "Argument Reasoning in Civil Procedure"
task_dir: "."
data_files:
  - sample-data.json
item_properties:
  id_key: "id"
  text_key: "text"
output_annotation_dir: "annotation_output/"
output_annotation_format: "json"
port: 8000
server_name: localhost
annotation_schemes:
  - annotation_type: radio
    name: answer_choice
    description: "Select the best answer to the civil procedure question."
    labels:
      - "A"
      - "B"
      - "C"
      - "D"
      - "E"
    keyboard_shortcuts:
      "A": "1"
      "B": "2"
      "C": "3"
      "D": "4"
      "E": "5"
    tooltips:
      "A": "Option A"
      "B": "Option B"
      "C": "Option C"
      "D": "Option D"
      "E": "Option E"
  - annotation_type: text
    name: legal_reasoning
    description: "Provide a brief legal reasoning for your answer choice."
annotation_instructions: |
  You will be shown a civil procedure question with five answer options. Your task is to:
  1. Read the question carefully, noting the legal scenario.
  2. Consider all five options.
  3. Select the best answer.
  4. Provide a brief legal reasoning explaining why your chosen answer is correct.
html_layout: |
  <div style="padding: 15px; max-width: 800px; margin: auto;">
    <div style="background: #f0f9ff; border: 1px solid #bae6fd; border-radius: 8px; padding: 16px; margin-bottom: 16px;">
      <strong style="color: #0369a1;">Question:</strong>
      <p style="font-size: 16px; line-height: 1.7; margin: 8px 0 0 0;">{{text}}</p>
    </div>
    <div style="background: #fefce8; border: 1px solid #fde68a; border-radius: 8px; padding: 16px; margin-bottom: 16px;">
      <strong style="color: #a16207;">Options:</strong>
      <ul style="font-size: 15px; line-height: 1.8; margin: 8px 0 0 0;">
        <li><strong>A:</strong> {{option_a}}</li>
        <li><strong>B:</strong> {{option_b}}</li>
        <li><strong>C:</strong> {{option_c}}</li>
        <li><strong>D:</strong> {{option_d}}</li>
        <li><strong>E:</strong> {{option_e}}</li>
      </ul>
    </div>
  </div>
allow_all_users: true
instances_per_annotator: 50
annotation_per_instance: 2
allow_skip: true
skip_reason_required: false
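The radio scheme above keeps three parallel mappings (labels, keyboard shortcuts, tooltips) that must stay in sync when options are added or renamed. As an illustrative sanity check (not part of Potato itself; the `check_scheme` helper and the inline dicts mirroring the config are assumptions for this sketch), a few lines of Python can verify consistency before launching the server:

```python
# Mirror the radio scheme from config.yaml (values copied from the config above).
labels = ["A", "B", "C", "D", "E"]
keyboard_shortcuts = {"A": "1", "B": "2", "C": "3", "D": "4", "E": "5"}
tooltips = {label: f"Option {label}" for label in labels}

def check_scheme(labels, shortcuts, tooltips):
    """Return a list of problems; an empty list means the scheme is consistent."""
    problems = []
    for label in labels:
        if label not in shortcuts:
            problems.append(f"label {label!r} has no keyboard shortcut")
        if label not in tooltips:
            problems.append(f"label {label!r} has no tooltip")
    # Two labels bound to the same key would make one of them unreachable.
    if len(set(shortcuts.values())) != len(shortcuts):
        problems.append("duplicate keyboard shortcuts")
    return problems

print(check_scheme(labels, keyboard_shortcuts, tooltips))  # prints []
```

Running this before `potato start` catches a mismatch (say, a sixth label with no shortcut) as a readable message rather than a confusing UI gap.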
Sample Data: sample-data.json
[
{
"id": "civpro_001",
"text": "A plaintiff from State A sues a defendant from State B in federal court for breach of contract. The amount in controversy is $60,000. The defendant moves to dismiss for lack of subject matter jurisdiction. How should the court rule?",
"option_a": "Grant the motion because the amount in controversy does not exceed $75,000",
"option_b": "Deny the motion because diversity jurisdiction exists",
"option_c": "Grant the motion because contract claims must be filed in state court",
"option_d": "Deny the motion because federal question jurisdiction applies",
"option_e": "Grant the motion because the plaintiff failed to join a necessary party"
},
{
"id": "civpro_002",
"text": "A driver from California causes an accident in Nevada. The injured pedestrian, a Nevada resident, files suit in California state court. The driver moves to dismiss for improper venue. What is the most likely outcome?",
"option_a": "The motion is granted because the accident occurred in Nevada",
"option_b": "The motion is denied because the driver is domiciled in California",
"option_c": "The motion is granted because the pedestrian should sue where the injury occurred",
"option_d": "The motion is denied because personal jurisdiction and venue are the same analysis",
"option_e": "The motion is granted because the pedestrian lacks standing in California"
}
]
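Because `html_layout` references `{{option_a}}` through `{{option_e}}` directly, every item in the data file needs all seven keys or the rendered page will have blank options. A minimal validation sketch (the `validate_items` helper is an assumption for illustration, not a Potato utility) could look like:

```python
import json

# Keys the html_layout template expects on every item.
REQUIRED_KEYS = {"id", "text", "option_a", "option_b",
                 "option_c", "option_d", "option_e"}

def validate_items(raw_json):
    """Return the ids of items missing any key the layout template needs."""
    items = json.loads(raw_json)
    return [item.get("id", "<no id>")
            for item in items
            if not REQUIRED_KEYS <= item.keys()]

sample = '''[{"id": "civpro_001", "text": "...", "option_a": "...",
              "option_b": "...", "option_c": "...", "option_d": "...",
              "option_e": "..."}]'''
print(validate_items(sample))  # prints []
```

An empty list means every item can be rendered; any id it returns points at an item to fix before annotators see it.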
// ... and 8 more items

Get This Design
Clone or download from the repository
Quick start:
git clone https://github.com/davidjurgens/potato-showcase.git
cd potato-showcase/semeval/2024/task05-argument-civil-procedure
potato start config.yaml
Related Designs
BRAINTEASER - Commonsense-Defying QA
Lateral thinking and commonsense-defying question answering task requiring annotators to select answers to brain teasers that defy default commonsense assumptions and provide explanations. Based on SemEval-2024 Task 9 (BRAINTEASER).
Clickbait Spoiling
Classification and extraction of spoilers for clickbait posts, including spoiler type identification and span-level spoiler detection. Based on SemEval-2023 Task 5 (Hagen et al.).
Clinical Trial NLI
Natural language inference for clinical trial reports, determining whether a given statement is entailed or contradicted by clinical trial evidence. Based on SemEval-2023 Task 7 (Jullien et al.).