Algorithms
NumPy
TensorFlow
- tf.einsum
- tensor-to-tensor [theory]
- https://zhuanlan.zhihu.com/p/32870503
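The `tf.einsum` entry above is easiest to grasp through the equation-string notation itself. A minimal sketch using `np.einsum`, which accepts the same Einstein-summation equation strings as `tf.einsum`:

```python
import numpy as np

# Batch matrix multiplication in Einstein-summation notation.
# "bij,bjk->bik" sums over the shared index j for each batch index b;
# tf.einsum takes the same equation strings as np.einsum.
a = np.arange(6.0).reshape(1, 2, 3)   # shape (batch=1, i=2, j=3)
b = np.arange(12.0).reshape(1, 3, 4)  # shape (batch=1, j=3, k=4)

out = np.einsum("bij,bjk->bik", a, b)  # shape (1, 2, 4)
assert np.allclose(out, a @ b)

# A trace is just another equation: "ii->" sums the diagonal.
m = np.eye(3) * 2.0
trace = np.einsum("ii->", m)  # 6.0
```

Indices repeated across inputs are summed; indices listed after `->` are kept, so the same notation also expresses transposes, traces, and tensor contractions.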
Deep Learning
- Implementing a CNN for Text Classification in TensorFlow
- Understanding Convolutional Neural Networks for NLP
- RNNs in Tensorflow, a Practical Guide and Undocumented Features
- Attention and Memory in Deep Learning and NLP
- Applications of CNNs in natural language processing
- Tutorials by well-known experts
- ATTENTION MECHANISM
- Text Classification, Part 3 - Hierarchical attention network
- A gentle read of "Attention is All You Need"
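The attention entries above (attention mechanism, hierarchical attention, "Attention is All You Need") all build on the same core operation, scaled dot-product attention. A minimal NumPy sketch with illustrative shapes:

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the max before exponentiating for numerical stability.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(q, k, v):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V,
    # the building block of the Transformer.
    d_k = q.shape[-1]
    scores = q @ k.swapaxes(-1, -2) / np.sqrt(d_k)  # (len_q, len_k)
    weights = softmax(scores, axis=-1)              # each row sums to 1
    return weights @ v, weights

rng = np.random.default_rng(0)
q = rng.normal(size=(4, 8))   # 4 query positions, d_k = 8
k = rng.normal(size=(6, 8))   # 6 key positions
v = rng.normal(size=(6, 8))   # one value vector per key
out, w = scaled_dot_product_attention(q, k, v)
```

Each output row is a weighted average of the value vectors, with weights given by query-key similarity; this is the "soft" attention discussed in the soft vs. hard attention entry below.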
RNN Structure and Implementation
- What exactly are the inputs and outputs of an LSTM network?
- Bidirectional RNNs and a TensorFlow implementation
- Understanding LSTM Networks
- The Unreasonable Effectiveness of Recurrent Neural Networks
- Recurrent Neural Networks Tutorial, Part 1 – Introduction to RNNs
- RNN with attention
- RNN Series, Part 4: the attention mechanism
- Understanding LSTM and its diagrams
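The LSTM input/output question in the entries above largely comes down to the shapes flowing through one cell step. A minimal NumPy sketch of a single LSTM time step, following the standard gate equations described in "Understanding LSTM Networks" (shapes here are illustrative assumptions):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    # One LSTM time step.
    # x: (batch, input_dim); h_prev, c_prev: (batch, hidden).
    hidden = h_prev.shape[1]
    z = np.concatenate([x, h_prev], axis=1) @ W + b  # (batch, 4*hidden)
    i = sigmoid(z[:, 0*hidden:1*hidden])   # input gate
    f = sigmoid(z[:, 1*hidden:2*hidden])   # forget gate
    o = sigmoid(z[:, 2*hidden:3*hidden])   # output gate
    g = np.tanh(z[:, 3*hidden:4*hidden])   # candidate cell state
    c = f * c_prev + i * g                 # new cell state
    h = o * np.tanh(c)                     # new hidden state (the output)
    return h, c

batch, input_dim, hidden = 2, 5, 3
rng = np.random.default_rng(0)
W = rng.normal(size=(input_dim + hidden, 4 * hidden))
b = np.zeros(4 * hidden)
h = c = np.zeros((batch, hidden))
# Unroll over a toy sequence of length 4: the same weights are reused
# at every step, and (h, c) carry state forward through time.
for x in rng.normal(size=(4, batch, input_dim)):
    h, c = lstm_step(x, h, c, W, b)
```

So per step the cell takes one input vector plus the previous `(h, c)` pair and returns a new `(h, c)`; a bidirectional RNN simply runs a second such cell over the sequence in reverse and concatenates the two hidden states.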
Deep Learning Theory
- A summary and comparison of deep-learning optimizers (SGD, Adagrad, Adadelta, Adam, Adamax, Nadam)
- The softmax function and cross-entropy
- Soft & hard attention
- Deep learning - Computation & optimization.
- Deep learning - Linear algebra.
- Attention? Attention!
- The Transformer – Attention is all you need.
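The softmax-and-cross-entropy entry above pairs naturally with the textbook identity that the gradient of the combined loss with respect to the logits is simply `p - one_hot`. A minimal NumPy sketch:

```python
import numpy as np

def softmax(logits):
    # Shift by the max so exp() cannot overflow; the result is unchanged.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def cross_entropy(probs, target):
    # Negative log-likelihood of the true class index.
    return -np.log(probs[target])

logits = np.array([2.0, 1.0, 0.1])
p = softmax(logits)            # a probability distribution over 3 classes
loss = cross_entropy(p, target=0)

# Gradient of the combined softmax + cross-entropy loss w.r.t. the logits:
one_hot = np.array([1.0, 0.0, 0.0])
grad = p - one_hot
```

This cancellation is why frameworks fuse the two into a single numerically stable op (e.g. TensorFlow's `softmax_cross_entropy_with_logits`) rather than computing softmax and log separately.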