Paper: Don't Stop Pretraining: Adapt Language Models to Domains and Tasks
GitHub: https://github.com/allenai/dont-stop-pretraining

Addressing the mismatch between the pretraining corpus and the target domain distribution, as well as the target task distribution, the paper proposes domain-adaptive pretraining (DAPT) and task-adaptive pretraining (TAPT).
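At a high level, both methods insert an extra phase of continued masked-language-model pretraining before fine-tuning: DAPT continues on a large unlabeled domain corpus, TAPT on the task's own unlabeled text, and the two can be chained (DAPT then TAPT). A minimal sketch of that pipeline ordering (the function name and flags are illustrative, not from the paper's code):

```python
def adaptation_schedule(use_dapt=True, use_tapt=True):
    """Return the training phases, in order, for the chosen adaptation strategy.

    This is a conceptual sketch of the paper's setup (RoBERTa as the base
    model, MLM as the continued-pretraining objective), not actual training
    code from the dont-stop-pretraining repository.
    """
    phases = ["start from pretrained LM (RoBERTa)"]
    if use_dapt:
        # DAPT: continue MLM pretraining on a large unlabeled domain corpus
        phases.append("DAPT: continued MLM pretraining on domain corpus")
    if use_tapt:
        # TAPT: continue MLM pretraining on the task's own unlabeled text
        phases.append("TAPT: continued MLM pretraining on task text")
    phases.append("fine-tune on labeled task data")
    return phases
```

Running `adaptation_schedule()` with both flags on yields the DAPT→TAPT combination the paper reports as strongest; turning either flag off recovers the single-phase variants.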