BioBART: Pretraining and Evaluation of A Biomedical Generative Language Model
Paper: arXiv 2204.03905 • Published 2022
How to use GanjinZero/biobart-large with Transformers:
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
tokenizer = AutoTokenizer.from_pretrained("GanjinZero/biobart-large")
model = AutoModelForSeq2SeqLM.from_pretrained("GanjinZero/biobart-large")
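Once loaded, the model can be used like any BART-style seq2seq model via `generate`. A minimal sketch (the input sentence and generation parameters below are illustrative choices, not from the model card; running it requires downloading the checkpoint):

```python
# Minimal generation sketch for BioBART (a BART-style seq2seq model).
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("GanjinZero/biobart-large")
model = AutoModelForSeq2SeqLM.from_pretrained("GanjinZero/biobart-large")

# Example biomedical input text (illustrative, not from the paper).
text = "Aspirin is commonly used to reduce fever and relieve mild pain."

# Tokenize and generate with beam search; max_length and num_beams
# are arbitrary example settings.
inputs = tokenizer(text, return_tensors="pt")
output_ids = model.generate(**inputs, max_length=64, num_beams=4)
decoded = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(decoded)
```

For downstream tasks reported in the paper (e.g. summarization or dialogue), the checkpoint would typically be fine-tuned first; out of the box it performs the denoising objective it was pretrained on.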
@misc{BioBART,
      title={BioBART: Pretraining and Evaluation of A Biomedical Generative Language Model},
      author={Hongyi Yuan and Zheng Yuan and Ruyi Gan and Jiaxing Zhang and Yutao Xie and Sheng Yu},
      year={2022},
      eprint={2204.03905},
      archivePrefix={arXiv}
}