Sixth Semester

Neural Networks

Course Title: Neural Networks

Course No.: CSC372

Course Nature: Theory + Lab

Semester: Sixth

Year: Third

Full Marks: 60 + 20 + 20

Pass Marks: 24 + 8 + 8

Credit Hours: 3

Course Description: The course introduces the underlying principles and design of Neural Networks. It covers the basic concepts of Neural Networks, including network architectures, learning processes, and single-layer and multilayer perceptrons, followed by Recurrent Neural Networks.

Course Objectives: The objective of the course is to demonstrate the concepts of supervised and unsupervised learning in conjunction with different Neural Network architectures.

Course Contents

Unit 1: Introduction to Neural Networks (4 Hrs.)

Basics of neural networks and the human brain, Models of a Neuron, Neural Networks Viewed as Directed Graphs, Feedback, Network Architectures, Knowledge Representation, Learning Processes, Learning Tasks
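As an illustrative aid to the model of a neuron listed above (not part of the prescribed contents), the following minimal NumPy sketch shows the standard weighted-sum-plus-activation computation; the sigmoid activation and the variable names are assumptions chosen for the example.

```python
import numpy as np

def neuron_output(x, w, b):
    """Single neuron: weighted sum of inputs plus bias, passed through
    an activation function (a sigmoid is assumed here)."""
    v = np.dot(w, x) + b              # induced local field
    return 1.0 / (1.0 + np.exp(-v))   # sigmoid activation phi(v)

# Example: a neuron with three inputs
x = np.array([0.5, -1.0, 2.0])
w = np.array([0.4, 0.3, -0.2])
print(neuron_output(x, w, b=0.1))
```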

Unit 2: Rosenblatt’s Perceptron (3 Hrs.)

Introduction, Perceptron, The Perceptron Convergence Theorem, Relation between the Perceptron and Bayes Classifier for a Gaussian Environment, The Batch Perceptron Algorithm
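A minimal sketch of the perceptron learning rule covered in this unit is given below, assuming NumPy, bipolar labels, and illustrative hyperparameters; it is an example, not the course's prescribed implementation.

```python
import numpy as np

def train_perceptron(X, y, eta=0.1, epochs=50):
    """Rosenblatt perceptron: update the weights only on misclassified
    samples, w <- w + eta * d * x, with labels d in {-1, +1}."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for x, d in zip(X, y):
            y_hat = 1 if np.dot(w, x) + b >= 0 else -1
            if y_hat != d:          # update only on mistakes
                w += eta * d * x
                b += eta * d
    return w, b

# Linearly separable toy data (AND-like), so convergence is guaranteed
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, -1, -1, 1])
w, b = train_perceptron(X, y)
print(w, b)
```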

Unit 3: Model Building through Regression (5 Hrs.)

Introduction, Linear Regression Model: Preliminary Considerations, Maximum a Posteriori Estimation of the Parameter Vector, Relationship Between Regularized Least-Squares Estimation and MAP Estimation, Computer Experiment: Pattern Classification, The Minimum-Description-Length Principle, Finite Sample-Size Considerations, The Instrumental-Variables Method
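The sketch below illustrates the regularized least-squares estimate discussed in this unit, which coincides with the MAP estimate under a zero-mean Gaussian prior on the weights and Gaussian noise; the toy data, regularization constant, and NumPy usage are assumptions for the example.

```python
import numpy as np

def regularized_least_squares(X, y, lam=0.1):
    """Regularized least-squares (ridge) estimate of the parameter vector:
    w = (X^T X + lam * I)^{-1} X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Toy example: recover a known parameter vector from noisy observations
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=20)
print(regularized_least_squares(X, y))
```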

Unit 4: The Least-Mean-Square Algorithm (5 Hrs.)

Introduction, Filtering Structure of the LMS Algorithm, Unconstrained Optimization: A Review, The Wiener Filter, The Least-Mean-Square Algorithm, Markov Model Portraying the Deviation of the LMS Algorithm from the Wiener Filter, The Langevin Equation: Characterization of Brownian Motion, Kushner's Direct-Averaging Method, Statistical LMS Learning Theory for Small Learning-Rate Parameter, Virtues and Limitations of the LMS Algorithm, Learning-Rate Annealing Schedules
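A minimal sketch of the LMS update covered in this unit follows, assuming NumPy and an illustrative system-identification setup; the learning rate and data are chosen only for demonstration.

```python
import numpy as np

def lms(X, d, eta=0.01, epochs=1):
    """Least-Mean-Square algorithm: for each sample, form the error
    e = d - w^T x and take the stochastic-gradient step w <- w + eta * e * x."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x, desired in zip(X, d):
            e = desired - np.dot(w, x)   # instantaneous error
            w += eta * e * x             # LMS update
    return w

# Identify a simple linear system d = w_true^T x + noise
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 2))
w_true = np.array([0.7, -0.3])
d = X @ w_true + 0.01 * rng.normal(size=500)
print(lms(X, d, eta=0.05, epochs=5))   # should approach w_true
```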

Unit 5: Multilayer Perceptron (8 Hrs.)

Introduction, Batch Learning and On-Line Learning, The Back-Propagation Algorithm, XOR Problem, Heuristics for Making the Back-Propagation Algorithm Perform Better, Back Propagation and Differentiation, The Hessian and Its Role in On-Line Learning, Optimal Annealing and Adaptive Control of the Learning Rate, Generalization, Approximations of Functions, Cross-Validation, Complexity Regularization and Network Pruning, Virtues and Limitations of Back-Propagation Learning, Supervised Learning Viewed as an Optimization Problem, Convolutional Networks, Nonlinear Filtering, Small-Scale Versus Large-Scale Learning Problems
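The following sketch ties together the back-propagation algorithm and the XOR problem listed above: a small two-layer sigmoid network trained by batch gradient descent. The network size, learning rate, epoch count, and random seed are assumptions for illustration, and convergence depends on the initialization.

```python
import numpy as np

rng = np.random.default_rng(42)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)   # XOR targets

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

# One hidden layer of 4 sigmoid units and one sigmoid output unit
W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)
eta = 0.5

for epoch in range(10000):
    # Forward pass (batch mode)
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: local gradients for squared error with sigmoid units
    delta_out = (out - y) * out * (1 - out)
    delta_h = (delta_out @ W2.T) * h * (1 - h)
    # Gradient-descent updates
    W2 -= eta * h.T @ delta_out; b2 -= eta * delta_out.sum(axis=0)
    W1 -= eta * X.T @ delta_h;   b1 -= eta * delta_h.sum(axis=0)

print(np.round(out, 3))   # should approach [0, 1, 1, 0] for most initializations
```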

Unit 6: Kernel Methods and Radial-Basis Function Networks (7 Hrs.)

Introduction, Cover's Theorem on the Separability of Patterns, The Interpolation Problem, Radial-Basis-Function Networks, K-Means Clustering, Recursive Least-Squares Estimation of the Weight Vector, Hybrid Learning Procedure for RBF Networks, Kernel Regression and Its Relation to RBF Networks
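As a sketch of the hybrid learning procedure named in this unit, the example below places Gaussian RBF centres with plain K-means and then fits the output weights by linear least squares (ordinary least squares is used here instead of the recursive estimator, for brevity); the target function, kernel width, and number of centres are assumptions.

```python
import numpy as np

def rbf_design_matrix(X, centers, sigma):
    """Gaussian radial-basis functions evaluated at each input."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2 * sigma ** 2))

def kmeans(X, k, iters=20, seed=0):
    """Plain K-means used to place the RBF centres (unsupervised stage)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers

# Hybrid learning: unsupervised centres + linear least squares for the weights
rng = np.random.default_rng(3)
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(np.pi * X[:, 0])                  # target function to approximate
centers = kmeans(X, k=10)
Phi = rbf_design_matrix(X, centers, sigma=0.3)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
print(np.abs(Phi @ w - y).max())             # approximation error on the training data
```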

Unit 7: Self-Organizing Maps (6 Hrs.)

Introduction, Two Basic Feature-Mapping Models, Self-Organizing Map, Properties of the Feature Map, Contextual Maps, Hierarchical Vector Quantization, Kernel Self-Organizing Map, Relationship between Kernel SOM and Kullback-Leibler Divergence
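A minimal sketch of the self-organizing map update follows: competition for the best-matching unit, a neighbourhood function over the output grid, and adaptation of the codebook vectors. The grid size, decay schedules, and data are assumptions chosen for the example.

```python
import numpy as np

def train_som(X, grid=(5, 5), epochs=30, eta0=0.5, sigma0=2.0, seed=0):
    """Kohonen SOM: find the best-matching unit (BMU) for each input and
    pull it and its grid neighbours towards the input, with the learning
    rate and neighbourhood radius shrinking over time."""
    rng = np.random.default_rng(seed)
    rows, cols = grid
    W = rng.uniform(size=(rows * cols, X.shape[1]))            # codebook vectors
    coords = np.array([(r, c) for r in range(rows) for c in range(cols)], dtype=float)
    for t in range(epochs):
        eta = eta0 * np.exp(-t / epochs)
        sigma = sigma0 * np.exp(-t / epochs)
        for x in X:
            bmu = np.argmin(((W - x) ** 2).sum(axis=1))        # competition
            dist2 = ((coords - coords[bmu]) ** 2).sum(axis=1)  # distance on the grid
            h = np.exp(-dist2 / (2 * sigma ** 2))              # neighbourhood function
            W += eta * h[:, None] * (x - W)                    # adaptation
    return W

rng = np.random.default_rng(7)
W = train_som(rng.uniform(size=(300, 2)))
print(W.shape)   # (25, 2) codebook vectors spread over the input space
```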

Unit 8: Dynamically Driven Recurrent Networks (7 Hrs.)

Introduction, Recurrent Network Architectures, Universal Approximation Theorem, Controllability and Observability, Computational Power of Recurrent Networks, Learning Algorithms, Back-Propagation Through Time, Real-Time Recurrent Learning, Vanishing Gradients in Recurrent Networks, Supervised Training Framework for Recurrent Networks Using Nonlinear Sequential State Estimators, Adaptivity Considerations, Case Study: Model Reference Applied to Neurocontrol
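To make the recurrent state update concrete, the sketch below shows the forward pass of a simple Elman-style recurrent network; this is the computation that back-propagation through time unfolds over the sequence. The layer sizes, weight scales, and input sequence are assumptions for illustration.

```python
import numpy as np

def rnn_forward(x_seq, W_in, W_rec, W_out, b):
    """Forward pass of a simple recurrent network: the hidden state at
    time t depends on the current input and the previous hidden state."""
    h = np.zeros(W_rec.shape[0])
    outputs = []
    for x in x_seq:
        h = np.tanh(W_in @ x + W_rec @ h + b)   # recurrent state update
        outputs.append(W_out @ h)               # readout at each time step
    return np.array(outputs), h

rng = np.random.default_rng(5)
W_in = rng.normal(scale=0.5, size=(8, 3))
W_rec = rng.normal(scale=0.5, size=(8, 8))
W_out = rng.normal(scale=0.5, size=(1, 8))
b = np.zeros(8)
x_seq = rng.normal(size=(10, 3))                # a sequence of 10 inputs
y_seq, h_final = rnn_forward(x_seq, W_in, W_rec, W_out, b)
print(y_seq.shape, h_final.shape)
```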

Laboratory Works:

Practical work should focus on Single-Layer Perceptrons, Multilayer Perceptrons, Supervised Learning, Unsupervised Learning, Recurrent Neural Networks, Linear Prediction, and Pattern Classification; a sketch of one possible exercise is given below.
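One possible lab exercise for the linear prediction topic (an illustrative sketch, not prescribed by the syllabus) is to predict each signal sample from its previous p samples by least squares; the signal, prediction order, and NumPy usage are assumptions.

```python
import numpy as np

# Linear prediction: estimate s[n] from s[n-1], ..., s[n-p]
rng = np.random.default_rng(9)
signal = np.sin(0.1 * np.arange(400)) + 0.05 * rng.normal(size=400)
p = 4                                          # prediction order

# Build the regression problem from overlapping windows of the signal
X = np.array([signal[n - p:n][::-1] for n in range(p, len(signal))])
d = signal[p:]
w, *_ = np.linalg.lstsq(X, d, rcond=None)      # least-squares predictor coefficients

pred = X @ w
print("prediction error (RMS):", np.sqrt(np.mean((pred - d) ** 2)))
```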

Text Books:

  1. Simon Haykin, Neural Networks and Learning Machines, 3rd Edition, Pearson

Reference Books:

  1. Christopher M. Bishop, Neural Networks for Pattern Recognition, Oxford University Press, 2003
  2. Martin T. Hagan, Neural Network Design, 2nd Edition, PWS Publishing Co.