https://arxiv.org/pdf/1810.04805.pdf

Abstract

Introduction

Related work - points out the shortcomings of prior approaches

In the related work on unsupervised pre-training, two approaches come up: feature-based and fine-tuning (see the sketch below)
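The difference is easiest to see in code. A minimal sketch, assuming PyTorch and the Hugging Face `transformers` library (neither appears in the paper itself): the feature-based approach keeps the pre-trained encoder frozen and feeds its outputs to a separate task model, while the fine-tuning approach keeps every pre-trained parameter trainable.

```python
import torch
from transformers import BertModel

# Load a pre-trained encoder (checkpoint name is illustrative).
encoder = BertModel.from_pretrained("bert-base-uncased")

# Feature-based (ELMo-style): freeze the encoder and use its
# hidden states as fixed features for a downstream task model.
for p in encoder.parameters():
    p.requires_grad = False

# Fine-tuning (GPT/BERT-style): leave all pre-trained parameters
# trainable and update them together with a task-specific head.
for p in encoder.parameters():
    p.requires_grad = True
```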


BERT

There are two steps in the BERT framework: pre-training and fine-tuning
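A hedged sketch of that two-step flow, again assuming the Hugging Face `transformers` library and PyTorch (the paper's released code uses TensorFlow); the binary classification head and hyperparameters are illustrative only.

```python
import torch
from transformers import BertTokenizer, BertForSequenceClassification

# Step 1 (pre-training) is already done: load a released checkpoint.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # hypothetical binary downstream task
)

# Step 2 (fine-tuning): all parameters are updated on labeled downstream data.
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
batch = tokenizer(["an example input sentence"], return_tensors="pt")
labels = torch.tensor([1])

outputs = model(**batch, labels=labels)  # forward pass with task loss
outputs.loss.backward()                  # backprop through the whole model
optimizer.step()
```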

Model Architecture