General Language Model

Goal: Build a general, pretrained language representation model

Why: The model can be adapted to various NLP tasks easily; we don't have to retrain a new model from scratch every time

https://www.youtube.com/watch?v=BhlOGGzC0Q0

Context is everything!
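As a toy illustration of why context matters (this is not BERT itself, just a stdlib-only sketch with a made-up corpus and a hypothetical `predict_masked` helper): a bidirectional model can use the words on *both* sides of a masked position to guess it, rather than only the words to the left.

```python
from collections import Counter

# Tiny made-up corpus for illustration only.
corpus = [
    "the bank raised interest rates",
    "the bank of the river flooded",
    "the bank raised interest again",
]

def predict_masked(left, right, corpus):
    """Return the word most often seen between `left` and `right`.

    Mimics (very crudely) how bidirectional context narrows down
    a masked word: both neighbors constrain the prediction.
    """
    counts = Counter()
    for sentence in corpus:
        words = sentence.split()
        for i in range(1, len(words) - 1):
            if words[i - 1] == left and words[i + 1] == right:
                counts[words[i]] += 1
    return counts.most_common(1)[0][0] if counts else None

# "the bank [MASK] interest ..." -> the right-side context rules out
# the river sense of "bank" and picks the financial one.
print(predict_masked("bank", "interest", corpus))  # "raised"
```

A real masked language model replaces the counting with a Transformer, but the principle is the same: conditioning on both sides of the mask is what makes the representation "bidirectional".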

Abstract

ref: https://jalammar.github.io/images/bert-transfer-learning.png

BERT → Bidirectional Encoder Representations from Transformers