- Background knowledge
- Optimization
- Unsupervised & Generative
- Computer Vision
- Natural Language Processing
- Speech
- Reinforcement
- Gaussian Process
  Supervised, Regression
- Importance Sampling
  Approximate
- Convolutional Neural Networks for Sentence Classification (2014. 8)
- Neural Machine Translation by Jointly Learning to Align and Translate (2014. 9)
- Text Understanding from Scratch (2015. 2)
  CNN, Character-level - arXiv
- Ask Me Anything: Dynamic Memory Networks for Natural Language Processing (2015. 6)
- Pointer Networks (2015. 6)
- Skip-Thought Vectors (2015. 6)
- A Neural Conversational Model (2015. 6)
  Seq2Seq, Conversation - arXiv
- Teaching Machines to Read and Comprehend (2015. 6)
- Effective Approaches to Attention-based Neural Machine Translation (2015. 8)
- Character-Aware Neural Language Models (2015. 8)
  CNN, Character-level - arXiv
- Neural Machine Translation of Rare Words with Subword Units (2015. 8)
- A Diversity-Promoting Objective Function for Neural Conversation Models (2015. 10)
- Multi-task Sequence to Sequence Learning (2015. 11)
- Multilingual Language Processing From Bytes (2015. 12)
- Strategies for Training Large Vocabulary Neural Language Models (2015. 12)
- Recurrent Memory Networks for Language Modeling (2016. 1)
  RMN, Memory Bank - arXiv
- Exploring the Limits of Language Modeling (2016. 2)
- Incorporating Copying Mechanism in Sequence-to-Sequence Learning (2016. 3)
- Attention Is All You Need (2017. 6)
- Neural Text Generation: A Practical Guide (2017. 11)
- Recent Advances in Recurrent Neural Networks (2018. 1)
  RNN, Recent Advances, Review - arXiv
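
Several of the papers above build on attention, most directly Attention Is All You Need (2017. 6). For quick orientation, here is a minimal NumPy sketch of that paper's scaled dot-product attention, softmax(QK^T / sqrt(d_k))V. The function name and toy shapes are my own, and multi-head projections, masking, and dropout are omitted; this is an illustration, not code from the paper.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """softmax(Q K^T / sqrt(d_k)) V with Q: (n_q, d_k), K: (n_k, d_k), V: (n_k, d_v)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                      # (n_q, n_k) similarity scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)       # row-wise softmax
    return weights @ V                                   # (n_q, d_v) weighted values

# Toy usage: 3 queries attending over 5 key/value pairs.
rng = np.random.default_rng(0)
Q, K, V = rng.normal(size=(3, 8)), rng.normal(size=(5, 8)), rng.normal(size=(5, 16))
print(scaled_dot_product_attention(Q, K, V).shape)       # (3, 16)
```
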
- Dropout (2012, 2014)
  Regularizer, Ensemble - arXiv (2012) | arXiv (2014) | note
- Batch Normalization (2015)
- Training Very Deep Networks (2015. 7)
- Layer Normalization (2016)
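
Batch Normalization (2015) and Layer Normalization (2016) differ mainly in the axis they normalize over: per feature across the batch versus per example across its features. The sketch below is my own plain-NumPy illustration of that contrast, not code from either paper; the learned scale/shift parameters and running statistics are omitted.

```python
import numpy as np

def batch_norm(x, eps=1e-5):
    """Batch Normalization: normalize each feature across the batch axis.
    x: (batch, features). Learned gamma/beta and running statistics omitted."""
    return (x - x.mean(axis=0)) / np.sqrt(x.var(axis=0) + eps)

def layer_norm(x, eps=1e-5):
    """Layer Normalization: normalize each example across its feature axis,
    so the statistics do not depend on the batch size."""
    return (x - x.mean(axis=-1, keepdims=True)) / np.sqrt(x.var(axis=-1, keepdims=True) + eps)

x = np.random.default_rng(0).normal(loc=3.0, scale=2.0, size=(4, 6))
print(batch_norm(x).mean(axis=0).round(6))    # ~0 for every feature (column-wise)
print(layer_norm(x).mean(axis=-1).round(6))   # ~0 for every example (row-wise)
```
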
