Serayuki-AI
The tokenizer is based on meta-llama/Llama-3.2-3B-Instruct.
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "Vynie/Serayuki-1B-v1.1-pre2-step-3k-1B"

# Pick a device and keep the model and inputs on the same one;
# generation fails if the inputs are on CUDA but the model is not.
device = "cuda" if torch.cuda.is_available() else "cpu"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id).to(device)

prompt = "Once upon a time, my friends and I wanted to"
inputs = tokenizer(prompt, return_tensors="pt").to(device)

# max_length counts the prompt tokens plus the generated tokens.
outputs = model.generate(**inputs, max_length=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
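The call above uses greedy decoding by default. For more varied continuations, `generate` also accepts standard sampling parameters. Below is a minimal sketch of that; the `temperature` and `top_p` values and the `generate_sample` helper are illustrative assumptions, not settings recommended by the model authors.

```python
# Illustrative sampling settings (assumptions, not official recommendations).
sampling_kwargs = {
    "max_new_tokens": 128,  # number of tokens to generate after the prompt
    "do_sample": True,      # stochastic sampling instead of greedy decoding
    "temperature": 0.8,     # soften the next-token distribution
    "top_p": 0.95,          # nucleus sampling: keep tokens covering 95% of the mass
}

def generate_sample(prompt: str) -> str:
    """Load the model and sample one continuation (downloads weights on first call)."""
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "Vynie/Serayuki-1B-v1.1-pre2-step-3k-1B"
    device = "cuda" if torch.cuda.is_available() else "cpu"

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id).to(device)

    inputs = tokenizer(prompt, return_tensors="pt").to(device)
    outputs = model.generate(**inputs, **sampling_kwargs)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

Usage: `print(generate_sample("Once upon a time, my friends and I wanted to"))` — each call can produce a different continuation because sampling is enabled.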
This project is licensed under the MIT License.