# xlnet

Here are 55 public repositories matching this topic...

BERT (Bidirectional Encoder Representations from Transformers) is a transformer-based method of learning language representations. It is a bidirectional transformer model pre-trained on a large corpus using a combination of two objectives: masked language modeling and next-sentence prediction.

  • Updated Aug 10, 2020
  • Python
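
The masked language modeling objective described above can be exercised directly with a pre-trained checkpoint. A minimal sketch, assuming the Hugging Face `transformers` library (`pip install transformers`) and the public `bert-base-uncased` checkpoint:

```python
from transformers import pipeline

# Load a pre-trained BERT model behind a fill-mask pipeline.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# BERT predicts the token hidden behind [MASK] from bidirectional
# (left and right) context, which is what the MLM objective trains.
for prediction in fill_mask("The capital of France is [MASK]."):
    print(f"{prediction['token_str']}: {prediction['score']:.3f}")
```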

An automated fact-checking solution that uses publicly available claim and fake-news datasets to fine-tune recently published state-of-the-art NLP language models (BERT, RoBERTa, XLNet, ConvBERT, ...) so they can classify unseen claims.

  • Updated Aug 28, 2022
  • Python
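
Fine-tuning one of these models for claim classification follows the standard sequence-classification recipe. A minimal sketch, assuming the Hugging Face `transformers` and `datasets` libraries and a hypothetical CSV claims dataset with `text` and `label` columns; the repository's actual data loading and hyperparameters may differ:

```python
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)
from datasets import load_dataset

tokenizer = AutoTokenizer.from_pretrained("xlnet-base-cased")
model = AutoModelForSequenceClassification.from_pretrained(
    "xlnet-base-cased", num_labels=2  # e.g. supported vs. refuted claims
)

# Hypothetical dataset: CSV files with "text" and "label" columns.
dataset = load_dataset("csv", data_files={"train": "claims_train.csv",
                                          "test": "claims_test.csv"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True,
                     padding="max_length", max_length=128)

dataset = dataset.map(tokenize, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="xlnet-claims", num_train_epochs=3),
    train_dataset=dataset["train"],
    eval_dataset=dataset["test"],
)
trainer.train()
```

Swapping in RoBERTa or ConvBERT is a matter of changing the checkpoint name passed to the `Auto*` classes; the rest of the recipe stays the same.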
