Instructions for using MilaDeepGraph/ProtST-ESM1b-LocalizationPrediction with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use MilaDeepGraph/ProtST-ESM1b-LocalizationPrediction with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline(
    "feature-extraction",
    model="MilaDeepGraph/ProtST-ESM1b-LocalizationPrediction",
    trust_remote_code=True,
)

# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained(
    "MilaDeepGraph/ProtST-ESM1b-LocalizationPrediction",
    trust_remote_code=True,
    dtype="auto",
)
```

- Notebooks
- Google Colab
- Kaggle
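The feature-extraction pipeline returns per-token embeddings; a common next step is mean pooling them into a single sequence-level vector. A minimal, self-contained sketch with toy data (the nested-list shape and 4-dimensional vectors are illustrative stand-ins, and the `mean_pool` helper is not part of the Transformers API; the real hidden size depends on the model):

```python
# Toy stand-in for pipeline output: [batch][token][hidden] nested lists.
# The real ProtST-ESM1b hidden size differs; 4 dims keep the example small.
features = [[[1.0, 2.0, 3.0, 4.0],
             [3.0, 4.0, 5.0, 6.0]]]

def mean_pool(token_embeddings):
    """Average per-token embeddings into one sequence-level vector."""
    n = len(token_embeddings)
    dim = len(token_embeddings[0])
    return [sum(tok[d] for tok in token_embeddings) / n for d in range(dim)]

embedding = mean_pool(features[0])
print(embedding)  # [2.0, 3.0, 4.0, 5.0]
```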
Update README.md (#3)
by Jiqing - opened

README.md CHANGED:
````diff
  The following script shows how to finetune ProtST on Gaudi.

  ## Running script
- ```
+ ```diff
  from transformers import AutoModel, AutoTokenizer, HfArgumentParser, TrainingArguments, Trainer
  from transformers.data.data_collator import DataCollatorWithPadding
  from transformers.trainer_pt_utils import get_parameter_names
````
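Among the imports in the diff, `get_parameter_names` is typically used in fine-tuning scripts to exclude bias and normalization parameters from weight decay when building optimizer parameter groups. A minimal, self-contained sketch of that grouping logic (the helper name `split_decay_parameters` and the toy parameter names are illustrative and not taken from the README):

```python
# Illustrative version of the decay/no-decay split that Trainer-style
# fine-tuning scripts build from get_parameter_names' output.
def split_decay_parameters(param_names, no_decay_keywords=("bias", "LayerNorm")):
    """Split parameter names into weight-decay and no-decay groups."""
    decay, no_decay = [], []
    for name in param_names:
        if any(k in name for k in no_decay_keywords):
            no_decay.append(name)
        else:
            decay.append(name)
    return decay, no_decay

# Toy example using typical transformer parameter names.
names = [
    "encoder.layer.0.attention.weight",
    "encoder.layer.0.attention.bias",
    "encoder.layer.0.LayerNorm.weight",
]
decay, no_decay = split_decay_parameters(names)
print(decay)     # only the attention weight gets weight decay
print(no_decay)  # bias and LayerNorm params are excluded
```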