👨🎓 This repo is a supplement to my video on Transformers and Text Summarization as part of my series AI does AI (https://youtu.be/p_6xgrykPMQ)
Updated Jan 23, 2021 · Python
An NLP question-answering service leveraging user reviews
Simple from-scratch implementations of transformer-based models that match the state of the art.
Experiments comparing the performance of several pre-trained transformer models on a basic sentiment regression task
BERT (Bidirectional Encoder Representations from Transformers) is a transformer-based method of learning language representations. It is a bidirectional transformer pre-trained on a large corpus with two objectives: masked language modeling and next sentence prediction.
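A minimal sketch of how those two pre-training objectives construct training examples, in plain Python (the toy vocabulary, function names, and 15% masking rate are illustrative assumptions, not code from any repo listed here):

```python
import random

MASK = "[MASK]"  # placeholder token, as in BERT's masked language modeling

def make_mlm_example(tokens, mask_prob=0.15, seed=1):
    """Mask roughly 15% of tokens; the originals become prediction targets."""
    rng = random.Random(seed)
    inputs, targets = [], {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            targets[i] = tok       # the model must recover this token
            inputs.append(MASK)
        else:
            inputs.append(tok)
    return inputs, targets

def make_nsp_example(sent_a, sent_b, corpus, rng):
    """Next sentence prediction: keep the true next sentence half the
    time (label 1), otherwise substitute a random one (label 0)."""
    if rng.random() < 0.5:
        return (sent_a, sent_b, 1)
    return (sent_a, rng.choice(corpus), 0)

tokens = ["the", "cat", "sat", "on", "the", "mat"]
masked, targets = make_mlm_example(tokens)
```

During pre-training the model sees `masked` as input and is trained to predict the entries of `targets`, while a second classification head is trained on the sentence-pair labels; this sketch only builds the examples, not the model.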
Generate text under lexical constraints.
Embeddings and language models
Hyper-parameter tuning framework for sentiment analysis of Internet news (CCF Big Data & Computing Intelligence Contest), with hyper-parameter tuning logs attached
Simple text classification [WIP]
An automated fact-checking solution that fine-tunes recently published state-of-the-art NLP language models (BERT, RoBERTa, XLNet, ConvBERT, ...) on available claims and fake-news datasets to classify unseen claims.
Pretrained transformer and embedding language models
A machine learning (ML) solution that reviews end-user license agreements (EULAs) for terms and conditions that are unacceptable to the government
Homework of Applied Deep Learning (ADL Lectured by Yun-Nung Chen at NTU)
Generalizing a question-answering system with pre-trained language model fine-tuning.
Final Project of Applied Deep Learning (ADL Lectured by Yun-Nung Chen at NTU)
Implementation, trained models and result data for the paper "Pairwise Multi-Class Document Classification for Semantic Relations between Wikipedia Articles"