MultiClinNER NL Models

Clinical named-entity recognition (NER) models for Dutch (NL), trained with a Multi-Head CRF architecture.

Best Model

  • Model: CLTL-C64-H3-E60-Arandom-%0.25-P0.2-42
  • Best F1: 0.6898
  • Branch: main

Usage

```python
# Load the best model (main branch)
from transformers import AutoTokenizer, AutoModelForTokenClassification

model = AutoModelForTokenClassification.from_pretrained("IEETA/MultiClinNER-NL")
tokenizer = AutoTokenizer.from_pretrained("IEETA/MultiClinNER-NL")

# Load a specific model variant (see the list of branches below)
model = AutoModelForTokenClassification.from_pretrained("IEETA/MultiClinNER-NL", revision="BRANCH_NAME")
```
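The model's token-level predictions still need to be grouped into entity spans. A minimal sketch, assuming a standard BIO tag scheme; the tag names and the example labels below are illustrative assumptions, not confirmed outputs of this model (the actual label set comes from the model's `id2label` config):

```python
# Sketch: merge token-level BIO predictions into entity spans.
# The BIO scheme and the entity labels used in the example are
# assumptions; adapt them to this model's actual id2label mapping.

def bio_to_spans(tokens, tags):
    """Group parallel (token, BIO-tag) lists into (label, text) spans."""
    spans, current = [], None
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            # A B- tag always starts a new entity.
            if current:
                spans.append(current)
            current = (tag[2:], [token])
        elif tag.startswith("I-") and current and tag[2:] == current[0]:
            # Continuation of the current entity.
            current[1].append(token)
        else:
            # O tag (or an inconsistent I- tag) closes any open entity.
            if current:
                spans.append(current)
            current = None
    if current:
        spans.append(current)
    return [(label, " ".join(parts)) for label, parts in spans]
```

A decoder like this would typically run over the argmax of the model's logits after mapping label ids to strings via `model.config.id2label`.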

All Models (18 variants)

| Branch | Model | Best? |
| --- | --- | --- |
| main | CLTL-C64-H3-E60-Arandom-%0.25-P0.2-42 | Yes |
| CLTL-C64-H3-E60-Anone-pct0.2-P0.5-123 | CLTL-C64-H3-E60-Anone-%0.2-P0.5-123 | |
| CLTL-C64-H3-E60-Anone-pct0.2-P0.5-42 | CLTL-C64-H3-E60-Anone-%0.2-P0.5-42 | |
| CLTL-C64-H3-E60-Anone-pct0.2-P0.5-456 | CLTL-C64-H3-E60-Anone-%0.2-P0.5-456 | |
| CLTL-C64-H3-E60-Anone-pct0.2-P0.5-999 | CLTL-C64-H3-E60-Anone-%0.2-P0.5-999 | |
| CLTL-C64-H3-E60-Arandom-pct0.1-P0.5-123 | CLTL-C64-H3-E60-Arandom-%0.1-P0.5-123 | |
| CLTL-C64-H3-E60-Arandom-pct0.1-P0.5-42 | CLTL-C64-H3-E60-Arandom-%0.1-P0.5-42 | |
| CLTL-C64-H3-E60-Arandom-pct0.1-P0.5-456 | CLTL-C64-H3-E60-Arandom-%0.1-P0.5-456 | |
| CLTL-C64-H3-E60-Arandom-pct0.25-P0.2-123 | CLTL-C64-H3-E60-Arandom-%0.25-P0.2-123 | |
| CLTL-C64-H3-E60-Arandom-pct0.25-P0.2-999 | CLTL-C64-H3-E60-Arandom-%0.25-P0.2-999 | |
| CLTL-C64-H3-E60-Aukn-pct0.1-P0.2-123 | CLTL-C64-H3-E60-Aukn-%0.1-P0.2-123 | |
| CLTL-C64-H3-E60-Aukn-pct0.1-P0.2-42 | CLTL-C64-H3-E60-Aukn-%0.1-P0.2-42 | |
| CLTL-C64-H3-E60-Aukn-pct0.1-P0.2-456 | CLTL-C64-H3-E60-Aukn-%0.1-P0.2-456 | |
| CLTL-C64-H3-E60-Aukn-pct0.1-P0.2-999 | CLTL-C64-H3-E60-Aukn-%0.1-P0.2-999 | |
| CLTL-C64-H3-E60-Aukn-pct0.1-P0.5-123 | CLTL-C64-H3-E60-Aukn-%0.1-P0.5-123 | |
| CLTL-C64-H3-E60-Aukn-pct0.1-P0.5-42 | CLTL-C64-H3-E60-Aukn-%0.1-P0.5-42 | |
| CLTL-C64-H3-E60-Aukn-pct0.1-P0.5-456 | CLTL-C64-H3-E60-Aukn-%0.1-P0.5-456 | |
| CLTL-C64-H3-E60-Aukn-pct0.1-P0.5-999 | CLTL-C64-H3-E60-Aukn-%0.1-P0.5-999 | |
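As the table shows, each branch name is simply the model name with `%` replaced by `pct` (presumably to keep branch names URL- and ref-safe). A small helper, based purely on that observed pattern, to derive the `revision` string for a model from the table:

```python
def model_to_branch(model_name: str) -> str:
    """Map a model name from the table above to its branch name.

    Based on the observed pattern in the table: branch names use
    'pct' wherever model names use '%'.
    """
    return model_name.replace("%", "pct")

# Example: derive the branch for one of the Aukn variants
branch = model_to_branch("CLTL-C64-H3-E60-Aukn-%0.1-P0.2-42")
# branch == "CLTL-C64-H3-E60-Aukn-pct0.1-P0.2-42"
```

The resulting string can then be passed as `revision=branch` to `from_pretrained`, as shown in the Usage section.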
Model Details

  • Format: Safetensors
  • Model size: 0.1B params
  • Tensor type: F32