Instructions to use trl-internal-testing/tiny-T5ForConditionalGeneration with libraries, inference providers, notebooks, and local apps.

- Libraries
  - Transformers
- Notebooks
  - Google Colab
  - Kaggle

How to use trl-internal-testing/tiny-T5ForConditionalGeneration with Transformers:

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("trl-internal-testing/tiny-T5ForConditionalGeneration")
model = AutoModelForSeq2SeqLM.from_pretrained("trl-internal-testing/tiny-T5ForConditionalGeneration")
```
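The loading snippet can be extended into a full encode–generate–decode round trip. The following is a minimal sketch, assuming `transformers` (with PyTorch) is installed and the Hugging Face Hub is reachable; the model is untrained, so the generated text is meaningless and only the mechanics matter here:

```python
# Round-trip sketch: load the tiny test model, generate, and decode.
# Note: this is a randomly initialized test model, so the output text
# is gibberish by design; it is only useful for exercising the API.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

repo = "trl-internal-testing/tiny-T5ForConditionalGeneration"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSeq2SeqLM.from_pretrained(repo)

# Encode an input, generate a few tokens, and decode the result.
inputs = tokenizer("translate English to German: Hello", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=5)
text = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(text)
```

Because the weights are random, the decoded string varies between checkpoints; in TRL's unit tests this model serves only to keep test runs fast and cheap.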
---
library_name: transformers
tags:
- trl
---

# Tiny T5ForConditionalGeneration

This is a minimal model built for unit tests in the [TRL](https://github.com/huggingface/trl) library.