Course Name: 

Deep Learning for Natural Language Processing (IT930)

Credits (L-T-P): 



Syllabus:

Introduction to NLP and deep learning; language modeling; history and applications; basic text processing. Simple word vector representations: word2vec, GloVe. Advanced word vector representations: language models, softmax, single-layer networks. Neural networks and back-propagation for named entity recognition; gradient checks, overfitting, regularization, activation functions. Recurrent neural networks for language modeling and other tasks; GRUs and LSTMs. Recursive neural networks for parsing and other applications. Convolutional neural networks for sentence classification. Reinforcement learning and applications. The future of deep learning for NLP: dynamic memory networks.
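As a small taste of the material, the softmax function recurs throughout these units (word vector models, neural language models, sentence classifiers). A minimal sketch in Python with NumPy is below; the max-subtraction step is the standard numerical-stability trick and is an illustrative choice, not something mandated by the syllabus.

```python
import numpy as np

def softmax(z):
    """Map a score vector to a probability distribution.

    Subtracting max(z) before exponentiating leaves the result
    unchanged mathematically but prevents overflow in np.exp.
    """
    shifted = z - np.max(z)
    exps = np.exp(shifted)
    return exps / np.sum(exps)

# Scores for a toy 3-word vocabulary; softmax turns them into
# probabilities that sum to 1, with the largest score winning.
probs = softmax(np.array([2.0, 1.0, 0.1]))
print(probs)
```

The same function serves as the output layer of a neural language model, where the score vector has one entry per vocabulary word.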


References:

Li Deng and Dong Yu, "Deep Learning: Methods and Applications", Foundations and Trends in Signal Processing, Now Publishers, 2014.
Josh Patterson and Adam Gibson, "Deep Learning: A Practitioner's Approach", 1st Edition, O'Reilly Media, 2017.
Christopher D. Manning and Hinrich Schütze, "Foundations of Statistical Natural Language Processing", MIT Press, 1999.
Ronan Collobert et al., "Natural Language Processing (Almost) from Scratch", Journal of Machine Learning Research, 12:2493-2537, 2011.



Contact us

Head of the Department,
Department of Information Technology,
National Institute of Technology Karnataka,
Surathkal, P. O. Srinivasnagar, Mangalore - 575 025
Ph.:    +91-824-2474056
Email:  hodit [at] nitk [dot] edu [dot] in
