# xysmalobia/sequence_classification

## How to use

You can use xysmalobia/sequence_classification with the Transformers library, either through the high-level `pipeline` helper or by loading the tokenizer and model directly.

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="xysmalobia/sequence_classification")
```
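
A quick smoke test of the pipeline. This is a minimal sketch: the sentences are invented, and because the card does not name the GLUE task, the paired-sentence input (MRPC-style) is an assumption.

```python
# Hypothetical inputs; the paired-sentence form assumes an MRPC-style task.
result = pipe({"text": "The company posted strong earnings.",
               "text_pair": "Earnings at the company were strong."})
print(result)  # e.g. [{'label': 'LABEL_1', 'score': 0.97}]
```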

```python
# Or load the tokenizer and model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("xysmalobia/sequence_classification")
model = AutoModelForSequenceClassification.from_pretrained("xysmalobia/sequence_classification")
```
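
With the model loaded directly, inference is a forward pass followed by a softmax. Again a hedged sketch: the inputs are invented, and the label names come from the checkpoint's own config, which the card does not document.

```python
import torch

# Hypothetical sentence pair; the GLUE task is assumed, not stated in the card.
inputs = tokenizer(
    "The company posted strong earnings.",
    "Earnings at the company were strong.",
    return_tensors="pt",
)
with torch.no_grad():
    logits = model(**inputs).logits

# Map the highest-scoring class id through the checkpoint's id2label table.
probs = logits.softmax(dim=-1)[0]
pred = int(probs.argmax())
print(model.config.id2label[pred], round(float(probs[pred]), 4))
```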

This model is a fine-tuned version of [google-bert/bert-base-uncased](https://huggingface.co/google-bert/bert-base-uncased) on the [glue](https://huggingface.co/datasets/nyu-mll/glue) dataset. It achieves the validation results shown in the training results table below.
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|---|---|---|---|---|---|
| No log | 1.0 | 459 | 0.3519 | 0.8627 | 0.9 |
| 0.4872 | 2.0 | 918 | 0.6387 | 0.8333 | 0.8893 |
| 0.2488 | 3.0 | 1377 | 0.7738 | 0.8529 | 0.8944 |
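
The hyperparameter list did not survive in this card, so nothing below is a reproduction recipe. For context, this is a minimal sketch of the kind of `Trainer` setup that produces a results table like the one above, under stated assumptions: the task (MRPC) is inferred from the Accuracy/F1 metric pair and from 459 steps per epoch matching MRPC's roughly 3,668 training pairs at a batch size of 8; the batch size and epoch count are likewise inferences, and everything else is a library default.

```python
import numpy as np
import evaluate
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Assumption: MRPC. The card does not name the GLUE task; MRPC is inferred
# from the step count and the Accuracy/F1 metric pair in the table above.
raw = load_dataset("glue", "mrpc")
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

def tokenize(batch):
    return tokenizer(batch["sentence1"], batch["sentence2"], truncation=True)

data = raw.map(tokenize, batched=True)
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

metric = evaluate.load("glue", "mrpc")  # reports accuracy and f1

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    return metric.compute(predictions=np.argmax(logits, axis=-1), references=labels)

args = TrainingArguments(
    output_dir="sequence_classification",
    evaluation_strategy="epoch",    # the table reports one eval per epoch
    per_device_train_batch_size=8,  # assumption: 459 steps/epoch over ~3,668 pairs
    num_train_epochs=3,             # matches the three epochs in the table
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=data["train"],
    eval_dataset=data["validation"],
    tokenizer=tokenizer,            # enables default dynamic padding
    compute_metrics=compute_metrics,
)
trainer.train()
```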