Week - 1
Course introduction, history of artificial neural networks, and their relation to biological neural systems. |
|
Week - 2
Artificial neuron model, input-output relationships, and basic activation functions (sigmoid, tanh, ReLU). |
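
A minimal sketch of a single artificial neuron and the three activation functions above, assuming Python with NumPy; the input, weight, and bias values are arbitrary illustrative choices:

    import numpy as np

    def sigmoid(x):
        # Logistic sigmoid: maps any real input into (0, 1)
        return 1.0 / (1.0 + np.exp(-x))

    def tanh(x):
        # Hyperbolic tangent: maps any real input into (-1, 1)
        return np.tanh(x)

    def relu(x):
        # Rectified linear unit: zero for negative inputs, identity otherwise
        return np.maximum(0.0, x)

    # A single neuron: weighted sum of inputs plus bias, passed through an activation
    x = np.array([0.5, -1.2, 3.0])   # illustrative inputs
    w = np.array([0.4, 0.7, -0.2])   # illustrative weights
    b = 0.1                          # bias
    z = np.dot(w, x) + b             # pre-activation (net input)
    print(sigmoid(z), tanh(z), relu(z))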
|
Week - 3
Structure of single-layer and multi-layer perceptrons (MLP) and introduction to feedforward networks. |
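
A sketch of the layered feedforward structure, assuming NumPy and hypothetical layer sizes (3 inputs, 4 hidden units, 2 outputs); only the forward pass appears here, training follows in Week 4:

    import numpy as np

    rng = np.random.default_rng(0)

    # A small feedforward MLP: 3 inputs -> 4 hidden units -> 2 outputs
    W1 = rng.normal(scale=0.5, size=(4, 3))   # hidden-layer weights
    b1 = np.zeros(4)
    W2 = rng.normal(scale=0.5, size=(2, 4))   # output-layer weights
    b2 = np.zeros(2)

    def forward(x):
        h = np.tanh(W1 @ x + b1)   # hidden layer with tanh activation
        return W2 @ h + b2         # linear output layer

    print(forward(np.array([1.0, 0.5, -0.3])))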
|
Week - 4
Forward propagation and backpropagation algorithms, and loss functions. |
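
A worked sketch of one forward and backward pass through a tiny network with a squared-error loss, assuming NumPy; the weights, input, and target are made-up values:

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # Tiny network: 2 inputs -> 2 sigmoid hidden units -> 1 linear output
    x = np.array([0.5, -0.4])            # input
    t = np.array([1.0])                  # target
    W1 = np.array([[0.3, -0.1],
                   [0.2,  0.4]])
    b1 = np.zeros(2)
    W2 = np.array([[0.5, -0.3]])
    b2 = np.zeros(1)

    # Forward pass
    z1 = W1 @ x + b1
    h  = sigmoid(z1)
    y  = W2 @ h + b2
    loss = 0.5 * np.sum((y - t) ** 2)    # squared-error loss

    # Backward pass: apply the chain rule layer by layer
    dy  = y - t                          # dL/dy
    dW2 = np.outer(dy, h)
    db2 = dy
    dh  = W2.T @ dy
    dz1 = dh * h * (1.0 - h)             # sigmoid derivative h(1 - h)
    dW1 = np.outer(dz1, x)
    db1 = dz1
    print(loss, dW1, dW2)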
|
Week - 5
Gradient descent and optimization techniques; the concepts of learning rate and epoch.
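
A sketch of batch gradient descent on a simple linear model, assuming NumPy and synthetic data; learning_rate and epochs are the tunable quantities named above:

    import numpy as np

    rng = np.random.default_rng(1)
    X = rng.normal(size=100)
    y = 3.0 * X + 0.5 + 0.1 * rng.normal(size=100)   # synthetic data: y = 3x + 0.5 + noise

    w, b = 0.0, 0.0
    learning_rate = 0.1
    epochs = 50

    for epoch in range(epochs):       # one epoch = one full pass over the data
        err = (w * X + b) - y
        grad_w = np.mean(err * X)     # gradient of the squared error w.r.t. w (up to a factor of 2)
        grad_b = np.mean(err)
        w -= learning_rate * grad_w   # step against the gradient
        b -= learning_rate * grad_b

    print(w, b)   # should move toward 3.0 and 0.5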
|
Week - 6
Overfitting and regularization techniques: dropout and L1/L2 regularization.
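
Minimal sketches of the two mechanisms, assuming NumPy; the drop probability and regularization strength lam are placeholder values:

    import numpy as np

    rng = np.random.default_rng(2)

    def dropout(h, p_drop=0.5, training=True):
        # Inverted dropout: randomly zero activations during training, rescale the rest
        if not training:
            return h
        mask = (rng.random(h.shape) >= p_drop) / (1.0 - p_drop)
        return h * mask

    def l2_penalty(weights, lam=1e-3):
        # L2 term added to the loss: lam * sum of squared weights
        return lam * sum(np.sum(W ** 2) for W in weights)

    def l1_penalty(weights, lam=1e-3):
        # L1 term added to the loss: lam * sum of absolute weights
        return lam * sum(np.sum(np.abs(W)) for W in weights)

    h = np.array([0.2, 1.5, -0.7, 0.9])   # example activations
    W = rng.normal(size=(4, 4))           # example weight matrix
    print(dropout(h), l2_penalty([W]), l1_penalty([W]))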
|
Week - 7
Supervised learning applications: classification and regression problems. |
|
Week - 8
Unsupervised learning and Kohonen networks (self-organizing maps). |
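
A minimal Kohonen map sketch, assuming NumPy; the 5x5 grid, learning rate, and neighborhood width are arbitrary choices:

    import numpy as np

    rng = np.random.default_rng(3)

    # A 5x5 self-organizing map over 2-D inputs
    grid = np.array([(i, j) for i in range(5) for j in range(5)], dtype=float)
    weights = rng.random((25, 2))

    def train_step(x, weights, lr=0.5, sigma=1.0):
        bmu = np.argmin(np.sum((weights - x) ** 2, axis=1))    # best-matching unit
        d2 = np.sum((grid - grid[bmu]) ** 2, axis=1)           # squared grid distance to the BMU
        neighborhood = np.exp(-d2 / (2 * sigma ** 2))          # Gaussian neighborhood function
        weights += lr * neighborhood[:, None] * (x - weights)  # pull neighbors toward the input

    for _ in range(1000):
        train_step(rng.random(2), weights)                     # train on random 2-D points
    print(weights[:3])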
|
Week - 9
Introduction to recurrent neural networks (RNNs) and their applications to time series and sequence data.
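
A sketch of an Elman-style recurrent cell unrolled over a short sequence, assuming NumPy and hypothetical sizes (2 inputs, 3 hidden units, 1 output):

    import numpy as np

    rng = np.random.default_rng(4)

    Wxh = rng.normal(scale=0.3, size=(3, 2))   # input -> hidden
    Whh = rng.normal(scale=0.3, size=(3, 3))   # hidden -> hidden (the recurrence)
    Why = rng.normal(scale=0.3, size=(1, 3))   # hidden -> output

    def run(sequence):
        h = np.zeros(3)                        # hidden state carried across time steps
        outputs = []
        for x in sequence:
            h = np.tanh(Wxh @ x + Whh @ h)     # new state depends on input and previous state
            outputs.append(Why @ h)
        return outputs

    seq = [np.array([0.1, 0.2]), np.array([0.0, -0.5]), np.array([0.3, 0.3])]
    print(run(seq))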
|
Week - 10
Hopfield networks and the basic principles of energy-based models. |
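
A sketch of a Hopfield network storing two orthogonal bipolar patterns with the Hebbian rule and recalling one from a noisy cue, assuming NumPy; the patterns themselves are made up:

    import numpy as np

    patterns = np.array([[ 1,  1,  1,  1,  1,  1],
                         [ 1,  1,  1, -1, -1, -1]])            # two orthogonal bipolar patterns
    n = patterns.shape[1]
    W = sum(np.outer(p, p) for p in patterns).astype(float)    # Hebbian weight matrix
    np.fill_diagonal(W, 0)                                      # no self-connections

    def energy(s):
        return -0.5 * s @ W @ s              # the Hopfield energy; updates never increase it

    def recall(s, steps=10):
        s = s.copy()
        for _ in range(steps):
            for i in range(n):               # asynchronous sign updates
                s[i] = 1 if W[i] @ s >= 0 else -1
        return s

    noisy = np.array([1, 1, 1, 1, 1, -1])    # pattern 0 with one flipped bit
    restored = recall(noisy)
    print(restored, energy(restored))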
|
Week - 11
Introduction to Boltzmann machines and restricted Boltzmann machines (RBM). |
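
A sketch of a contrastive-divergence (CD-1) update for a tiny RBM, assuming NumPy; the 6 visible / 3 hidden sizes and the synthetic binary data are placeholders:

    import numpy as np

    rng = np.random.default_rng(5)

    n_vis, n_hid = 6, 3
    W = 0.1 * rng.normal(size=(n_vis, n_hid))
    a = np.zeros(n_vis)                      # visible biases
    b = np.zeros(n_hid)                      # hidden biases

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def cd1_update(v0, lr=0.1):
        # One contrastive-divergence step: positive phase, one Gibbs step, negative phase
        global W, a, b
        ph0 = sigmoid(v0 @ W + b)                        # P(h=1 | v0)
        h0 = (rng.random(n_hid) < ph0).astype(float)     # sample hidden units
        pv1 = sigmoid(h0 @ W.T + a)                      # reconstruction P(v=1 | h0)
        v1 = (rng.random(n_vis) < pv1).astype(float)
        ph1 = sigmoid(v1 @ W + b)
        W += lr * (np.outer(v0, ph0) - np.outer(v1, ph1))
        a += lr * (v0 - v1)
        b += lr * (ph0 - ph1)

    data = (rng.random((20, n_vis)) < 0.5).astype(float)  # synthetic binary training data
    for epoch in range(10):
        for v in data:
            cd1_update(v)
    print(W)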
|
Week - 12
Fundamentals of convolutional neural networks (CNN) and applications in image processing. |
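
A sketch of the convolution operation at the heart of a CNN, assuming NumPy; the toy image and the hand-written vertical-edge kernel are illustrative:

    import numpy as np

    def conv2d(image, kernel):
        # 'Valid' 2-D convolution (cross-correlation, as in most CNN libraries)
        kh, kw = kernel.shape
        oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
        out = np.zeros((oh, ow))
        for i in range(oh):
            for j in range(ow):
                out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
        return out

    image = np.arange(36, dtype=float).reshape(6, 6)   # toy 6x6 "image"
    edge_kernel = np.array([[1.0, 0.0, -1.0],
                            [1.0, 0.0, -1.0],
                            [1.0, 0.0, -1.0]])         # vertical-edge detector
    feature_map = np.maximum(conv2d(image, edge_kernel), 0)   # convolution followed by ReLU
    print(feature_map)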
|
Week - 13
Comparison of network architectures with their advantages and disadvantages; applications to real-world data.
|
Week - 14
Presentation of term projects, overall evaluation, and a brief look at advanced topics.