Course Name: Deep Learning for NLP

Credits (L-T-P):
Introduction to NLP and Deep Learning; Language Modeling; History and Applications; Basic Text Processing; Simple word vector representations: word2vec, GloVe; Advanced word vector representations: language models, softmax, single-layer networks; Neural networks and back-propagation for named entity recognition; Gradient checks, overfitting, regularization, activation functions; Recurrent neural networks for language modeling and other tasks; GRUs and LSTMs; Recursive neural networks for parsing and other applications; Convolutional neural networks for sentence classification; Reinforcement learning and applications; The future of Deep Learning for NLP: Dynamic Memory Networks.
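As a small taste of two building blocks named above (word vector representations and the softmax), the sketch below computes cosine similarity between toy word vectors and a softmax over scores. The 3-d vectors are made up for illustration only; they are not course material or real word2vec/GloVe embeddings.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax: subtract the max before exponentiating."""
    e = np.exp(z - np.max(z))
    return e / e.sum()

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Hypothetical 3-d "word vectors" (real embeddings have hundreds of dimensions).
vectors = {
    "king":  np.array([0.90, 0.10, 0.40]),
    "queen": np.array([0.85, 0.15, 0.45]),
    "apple": np.array([0.10, 0.90, 0.20]),
}

print(cosine(vectors["king"], vectors["queen"]))  # near 1: similar words
print(cosine(vectors["king"], vectors["apple"]))  # much lower
print(softmax(np.array([2.0, 1.0, 0.1])))         # probabilities summing to 1
```

In practice the course's word2vec topic trains such vectors so that words in similar contexts end up close under exactly this cosine measure, and the softmax turns a network's raw scores into a probability distribution over classes or words.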


Li Deng and Dong Yu, "Deep Learning: Methods and Applications", Foundations and Trends in Signal Processing, Now Publishers, 2014.
Josh Patterson and Adam Gibson, "Deep Learning: A Practitioner's Approach", 1st Edition, O'Reilly Media, 2017.
Christopher D. Manning and Hinrich Schütze, "Foundations of Statistical Natural Language Processing", MIT Press, 1999.
Ronan Collobert et al., "Natural Language Processing (Almost) from Scratch", Journal of Machine Learning Research, 12:2493-2537, 2011.


Information Technology

Contact us

G. Ram Mohana Reddy

Professor and Head,
Department of Information Technology, NITK, Surathkal,
P. O. Srinivasnagar, Mangalore - 575 025
Karnataka, India.
Ph.:    +91-824-2474056
Email:  infotech[AT]nitk[DOT]ac[DOT]in
