DNN models

References:

[1]. Dai, A. M., & Le, Q. V. (2015). Semi-supervised sequence learning. In Advances in Neural Information Processing Systems (pp. 3079-3087). https://papers.nips.cc/paper/5949-semi-supervised-sequence-learning.pdf

[2]. Zhou, X., Wan, X., & Xiao, J. (2016). Attention-based LSTM Network for Cross-Lingual Sentiment Classification. In Proceedings of EMNLP 2016. https://aclweb.org/anthology/D/D16/D16-1024.pdf

[3]. Liu, P., Qiu, X., & Huang, X. (2016). Recurrent neural network for text classification with multi-task learning. arXiv preprint arXiv:1605.05101. https://arxiv.org/pdf/1605.05101.pdf

Cross-language embeddings

References:

[1]. AP, S. C., Lauly, S., Larochelle, H., Khapra, M., Ravindran, B., Raykar, V. C., & Saha, A. (2014). An autoencoder approach to learning bilingual word representations. In Advances in Neural Information Processing Systems (pp. 1853-1861). http://www.sarathchandar.in/paper/SarathNIPS2014

[2]. Xiao, M., & Guo, Y. (2013). Semi-Supervised Representation Learning for Cross-Lingual Text Classification. In EMNLP (pp. 1465-1475). https://www.aclweb.org/anthology/D/D13/D13-1153.pdf

[3]. Upadhyay, S., Faruqui, M., Dyer, C., & Roth, D. (2016). Cross-lingual models of word embeddings: An empirical comparison. arXiv preprint arXiv:1604.00425. https://arxiv.org/pdf/1604.00425.pdf

[4]. Pham, H., Luong, M. T., & Manning, C. D. (2015). Learning distributed representations for multilingual text sequences. In Proceedings of NAACL-HLT (pp. 88-94). http://www.aclweb.org/website/old_anthology/W/W15/W15-15.pdf#page=100

[5]. Vulić, I., & Moens, M. F. (2016). Bilingual distributed word representations from document-aligned comparable data. Journal of Artificial Intelligence Research, 55, 953-994. http://www.jair.org/media/4986/live-4986-9243-jair.pdf