Flan-T5 is an instruction-fine-tuned version of Google's T5 model: the same architecture, further trained on a large collection of tasks phrased as natural-language instructions, which makes it noticeably better at following prompts.
For example, it is reported to outperform GPT-3 on a number of zero-shot benchmarks despite being a much smaller model.
Install and Set Up Flan-T5
The examples below load the model with PyTorch, so install it alongside transformers:
pip install transformers torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer
model = AutoModelForSeq2SeqLM.from_pretrained("google/flan-t5-small")
tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-small")
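Flan-T5 is published in several sizes (small, base, large, xl, xxl), all following the same "google/flan-t5-<size>" naming pattern. As a rough sketch, assuming you have enough memory for the larger checkpoint, you can swap the model name and keep everything else the same:
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer
# Illustrative: larger checkpoints generally give better results but need more RAM/VRAM
model_name = "google/flan-t5-base"  # or "google/flan-t5-large", "-xl", "-xxl"
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)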
Using Flan-T5 for Language Tasks
inputs = tokenizer("A intro paragraph on a article on space travel:", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=100)  # the default generation length is quite short
print(tokenizer.batch_decode(outputs, skip_special_tokens=True))
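Because Flan-T5 is instruction-tuned, the same pattern works for other zero-shot prompts. The prompts below are illustrative examples, not fixed templates required by the model:
# Zero-shot translation (illustrative prompt)
inputs = tokenizer("Translate to German: How old are you?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True))

# Zero-shot question answering (illustrative prompt)
inputs = tokenizer("Answer the question: What is the capital of France?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True))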