PathologyBERT -- Pre-trained Vs. A New Transformer Language Model for Pathology Domain
BibTeX
@misc{santos2022pathologybertpretrainedvs,
title={PathologyBERT -- Pre-trained Vs. A New Transformer Language Model for Pathology Domain},
author={Thiago Santos and Amara Tariq and Susmita Das and Kavyasree Vayalpati and Geoffrey H. Smith and Hari Trivedi and Imon Banerjee},
year={2022},
eprint={2205.06885},
archivePrefix={arXiv},
primaryClass={cs.CL},
url={https://arxiv.org/abs/2205.06885},
}