Deep Learning (Semester VII)
Course Code: BCS714A
CIE Marks: 50
Teaching Hours/Week (L:T:P:S): 3:0:0:0
SEE Marks: 50
Total Hours of Pedagogy: 40
Total Marks: 100
Credits: 03
Exam Hours: 03
Examination type (SEE): Theory

Introducing Deep Learning: Biological and Machine Vision: Biological Vision, Machine Vision: The Neocognitron, LeNet-5, The Traditional Machine Learning Approach, ImageNet and the ILSVRC, AlexNet, TensorFlow Playground. Human and Machine Language: Deep Learning for Natural Language Processing: Deep Learning Networks Learn Representations Automatically, Natural Language Processing, A Brief History of Deep Learning for NLP, Computational Representations of Language: One-Hot Representations of Words, Word Vectors, Word-Vector Arithmetic, word2viz, Localist Versus Distributed Representations, Elements of Natural Human Language.

Textbook 2: Chapters 1 and 2
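The "One-Hot Representations of Words" and "Word-Vector Arithmetic" topics above can be sketched in a few lines. The tiny vocabulary and the two-dimensional dense vectors below are invented purely for illustration; real embeddings (e.g. from word2vec, covered later in the course) are learned from a corpus.

```python
import numpy as np

# Toy vocabulary; the dense vectors are hand-crafted to loosely encode
# (royalty, maleness) features -- an assumption for illustration only.
vocab = ["king", "queen", "man", "woman"]

# One-hot (localist) representations: sparse, one dimension per word.
one_hot = {w: np.eye(len(vocab))[i] for i, w in enumerate(vocab)}
print(one_hot["king"])  # [1. 0. 0. 0.]

# Distributed representations: dense vectors where similar words are nearby.
vec = {
    "king":  np.array([0.9, 0.9]),
    "queen": np.array([0.9, 0.1]),
    "man":   np.array([0.1, 0.9]),
    "woman": np.array([0.1, 0.1]),
}

def cosine(a, b):
    """Cosine similarity, the usual closeness measure for word vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Word-vector arithmetic: king - man + woman should land nearest "queen".
target = vec["king"] - vec["man"] + vec["woman"]
nearest = max((w for w in vocab if w != "king"),
              key=lambda w: cosine(vec[w], target))
print(nearest)  # queen
```

The contrast between the sparse `one_hot` dictionary and the dense `vec` dictionary is exactly the "Localist Versus Distributed Representations" distinction listed in the module.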


Regularization for Deep Learning: Parameter Norm Penalties, Norm Penalties as Constrained Optimization, Regularization and Under-Constrained Problems, Dataset Augmentation, Noise Robustness, Semi-Supervised Learning, Multi-Task Learning, Early Stopping, Parameter Tying and Parameter Sharing, Sparse Representations. Optimization for Training Deep Models: How Learning Differs from Pure Optimization, Basic Algorithms, Parameter Initialization Strategies, Algorithms with Adaptive Learning Rates.

Textbook 1: Chapter 7 (7.1 to 7.10), Chapter 8 (8.1, 8.3, 8.4, 8.5)
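The simplest parameter norm penalty from this module, L2 weight decay, can be sketched as follows. The synthetic data, the hypothetical `fit` helper, and the step counts are illustrative assumptions, not the textbook's notation.

```python
import numpy as np

# Minimal weight-decay sketch, assuming a least-squares loss:
#   J(w) = ||Xw - y||^2 / (2n) + (alpha/2) ||w||^2
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=50)

def fit(alpha, lr=0.1, steps=500):
    """Plain gradient descent; alpha is the L2 penalty coefficient."""
    w = np.zeros(3)
    n = len(y)
    for _ in range(steps):
        # Data-fit gradient plus the penalty's gradient, alpha * w.
        grad = X.T @ (X @ w - y) / n + alpha * w
        w -= lr * grad
    return w

w_plain = fit(alpha=0.0)   # no regularization
w_decay = fit(alpha=1.0)   # strong L2 penalty

# The penalty shrinks the learned weights toward the origin.
print(np.linalg.norm(w_decay) < np.linalg.norm(w_plain))  # True
```

The shrinkage visible here is the mechanism behind the "Norm Penalties as Constrained Optimization" view: the penalty is equivalent to constraining the weights to a ball around the origin.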


Convolutional Neural Networks: The Convolution Operation, Motivation, Pooling, Convolution and Pooling as an Infinitely Strong Prior, Variants of the Basic Convolution Function, Structured Outputs, Data Types, Efficient Convolution Algorithms, Convolutional Networks and the History of Deep Learning.

Textbook 1: Chapter 9 (9.1 to 9.8, 9.11)
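The module's first two topics, the convolution operation and pooling, can be sketched directly. As the textbook notes, deep learning libraries actually implement cross-correlation and call it convolution; the sketch below does the same. The 4x4 image and the horizontal-difference filter are illustrative assumptions.

```python
import numpy as np

def conv2d_valid(image, kernel):
    """'Valid' 2-D convolution (implemented as cross-correlation):
    slide the kernel over every position where it fits entirely."""
    kh, kw = kernel.shape
    h = image.shape[0] - kh + 1
    w = image.shape[1] - kw + 1
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(x, size=2):
    """Non-overlapping max pooling over size x size windows."""
    h, w = x.shape[0] // size, x.shape[1] // size
    return x[:h * size, :w * size].reshape(h, size, w, size).max(axis=(1, 3))

image = np.arange(16.0).reshape(4, 4)   # rows are 0..3, 4..7, ...
edge = np.array([[1.0, -1.0]])          # horizontal difference filter
fmap = conv2d_valid(image, edge)        # shape (4, 3); every entry is -1.0
pooled = max_pool(fmap)                 # shape (2, 1)
print(fmap.shape, pooled.shape)         # (4, 3) (2, 1)
```

The same small `edge` kernel being applied at every spatial position is the parameter sharing that motivates convolutional networks, and pooling supplies the invariance discussed under "Convolution and Pooling as an Infinitely Strong Prior".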


Sequence Modelling: Recurrent and Recursive Nets: Unfolding Computational Graphs, Recurrent Neural Networks, Bidirectional RNNs, Encoder-Decoder Sequence-to-Sequence Architectures, Deep Recurrent Networks, Recursive Neural Networks, Long Short-Term Memory.

Textbook 1: Chapter 10 (10.1 to 10.6, 10.10)
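The "Unfolding Computational Graphs" idea at the start of this module can be sketched as a vanilla RNN forward pass: the same weight matrices are reused at every time step of the unrolled loop. All shapes and the random sequence below are illustrative assumptions.

```python
import numpy as np

# One forward pass through an unfolded recurrent net:
#   h_t = tanh(W_xh x_t + W_hh h_{t-1} + b)
rng = np.random.default_rng(1)
d_in, d_h, T = 3, 4, 5                      # input dim, hidden dim, sequence length
W_xh = rng.normal(scale=0.1, size=(d_h, d_in))
W_hh = rng.normal(scale=0.1, size=(d_h, d_h))
b = np.zeros(d_h)

xs = rng.normal(size=(T, d_in))             # an input sequence of length T
h = np.zeros(d_h)                           # initial hidden state
states = []
for x in xs:                                # one node of the unfolded graph per step
    h = np.tanh(W_xh @ x + W_hh @ h + b)    # same W_xh, W_hh shared across time
    states.append(h)

print(len(states), states[-1].shape)        # 5 (4,)
```

Because `W_xh` and `W_hh` appear in every iteration, gradients through this loop multiply through `W_hh` repeatedly, which is exactly the vanishing/exploding-gradient difficulty that the module's LSTM topic addresses.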


Interactive Applications of Deep Learning: Natural Language Processing: Preprocessing Natural Language Data: Tokenization, Converting All Characters to Lowercase, Removing Stop Words and Punctuation, Stemming, Handling n-grams, Preprocessing the Full Corpus, Creating Word Embeddings with word2vec: The Essential Theory Behind word2vec, Evaluating Word Vectors, Running word2vec, Plotting Word Vectors, The Area under the ROC Curve: The Confusion Matrix, Calculating the ROC AUC Metric, Natural Language Classification with Familiar Networks: Loading the IMDb Film Reviews, Examining the IMDb Data, Standardizing the Length of the Reviews, Dense Network, Convolutional Networks, Networks Designed for Sequential Data: Recurrent Neural Networks, Long Short-Term Memory Units, Bidirectional LSTMs, Stacked Recurrent Models, Seq2seq and Attention, Transfer Learning in NLP.

Textbook 2: Chapter 8
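The "Calculating the ROC AUC Metric" topic in this module can be sketched from the metric's rank-statistic definition: AUC is the probability that a randomly chosen positive example scores higher than a randomly chosen negative one. The labels and scores below are made up for illustration; in practice one would use a library routine such as scikit-learn's `roc_auc_score`.

```python
def roc_auc(labels, scores):
    """ROC AUC via its rank-statistic definition: the fraction of
    (positive, negative) pairs ranked correctly, with ties counted as half."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Illustrative classifier outputs: one positive (score 0.6) is out-ranked
# by one negative (score 0.7), so 8 of 9 pairs are ordered correctly.
y_true = [1, 1, 0, 0, 1, 0]
y_score = [0.9, 0.8, 0.7, 0.3, 0.6, 0.2]
print(roc_auc(y_true, y_score))  # 0.888...
```

A perfect ranking gives 1.0 and a random one about 0.5, which is why AUC is the standard threshold-free summary for binary classifiers like the IMDb sentiment models in this module.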

2022 SCHEME QUESTION PAPER

Model Set 1 Paper

Model Set 1 Paper Solution

Model Set 2 Paper

Model Set 2 Paper Solution

Regular Paper

Back Paper
