Course detail

Convolutional Neural Networks

FIT-KNNAcad. year: 2025/2026

Solutions based on machine learning approaches are gradually replacing more and more hand-designed solutions in many areas of software development, especially in perceptual tasks focused on extracting information from unstructured sources such as cameras and microphones. Today, the dominant methods in machine learning are neural networks and their convolutional variants. These approaches are at the core of many commercially successful applications, and they push forward the frontiers of artificial intelligence.

Language of instruction

Czech

Number of ECTS credits

5

Mode of study

Not applicable.

Entry knowledge

Basics of linear algebra (multiplication of vectors and matrices), differential calculus (partial derivatives, chain rule), Python, and an intuitive understanding of probability (e.g. conditional probability). Any knowledge of machine learning and image processing is an advantage.
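The prerequisites above (matrix multiplication, partial derivatives, the chain rule, Python) can be illustrated with a short NumPy sketch; this example is not part of the course materials, only an indication of the expected level. It applies the chain rule to f(x) = ||Wx||² and verifies the analytic gradient numerically:

```python
import numpy as np

# Matrix-vector multiplication: y = W x
W = np.array([[1.0, 2.0], [3.0, 4.0]])
x = np.array([0.5, -1.0])

# Chain rule: for f(x) = sum((W x)^2), df/dx = 2 W^T (W x)
grad_analytic = 2 * W.T @ (W @ x)

# Numerical check with central differences
f = lambda v: np.sum((W @ v) ** 2)
eps = 1e-6
grad_numeric = np.zeros_like(x)
for i in range(len(x)):
    xp, xm = x.copy(), x.copy()
    xp[i] += eps
    xm[i] -= eps
    grad_numeric[i] = (f(xp) - f(xm)) / (2 * eps)

assert np.allclose(grad_analytic, grad_numeric, atol=1e-4)
```

Being comfortable with this kind of gradient derivation and check is roughly the level assumed at the start of the course.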

Rules for evaluation and completion of the course

  • Project concluded by public presentation - 65 points.
  • Two tests during the semester - 35 points.

Aims

Basic knowledge of convolutional neural networks, their capabilities and limitations. Practical applications, mostly in computer vision tasks, complemented by tasks from speech recognition and language processing. To enable students to design complete solutions using convolutional networks in practical applications, including network architectures, optimization, data collection, testing, and evaluation.
Students will gain basic knowledge of convolutional neural networks, their training (optimization), their building blocks, and of the tools and software frameworks used to implement them. Students will gain insight into what factors determine the accuracy of networks in real applications, including data sets, loss functions, network structure, regularization, optimization, overfitting, and multi-task learning. They will receive an overview of state-of-the-art networks in a range of computer vision tasks (classification, object detection, segmentation, identification), speech recognition, language understanding, data generation, and reinforcement learning.
Students will gain teamwork experience during the project and acquire basic knowledge of Python libraries for linear algebra and machine learning.
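As an illustration of the "loss functions" topic mentioned above (not taken from the course materials), the following sketch implements the softmax cross-entropy loss commonly used for classification, together with its well-known gradient with respect to the logits:

```python
import numpy as np

def softmax(z):
    z = z - z.max()        # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()

def cross_entropy(logits, target):
    """Negative log-likelihood of the target class."""
    return -np.log(softmax(logits)[target])

logits = np.array([2.0, 1.0, 0.1])
loss = cross_entropy(logits, target=0)

# Gradient w.r.t. logits: softmax(logits) - one_hot(target)
grad = softmax(logits).copy()
grad[0] -= 1.0
```

The gradient formula softmax(z) − one_hot(t) is what backpropagation starts from in a typical classification network.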

Study aids

Not applicable.

Prerequisites and corequisites

Not applicable.

Basic literature

Goodfellow, I., Bengio, Y., Courville, A.: Deep Learning. MIT Press, 2016. (EN)

Recommended reading

Bishop, C. M.: Pattern Recognition, Springer Science + Business Media, LLC, 2006, ISBN 0-387-31073-8.
Goodfellow, I., Bengio, Y., Courville, A.: Deep Learning. MIT Press, 2016.
Li, Fei-Fei, et al.: CS231n: Convolutional Neural Networks for Visual Recognition. Stanford, 2018.

Classification of course in study plans

  • Programme MITAI Master's

    specialization NSEC , 0 year of study, summer semester, elective
    specialization NISY up to 2020/21 , 0 year of study, summer semester, elective
    specialization NNET , 0 year of study, summer semester, elective
    specialization NMAL , 0 year of study, summer semester, compulsory
    specialization NCPS , 0 year of study, summer semester, elective
    specialization NHPC , 0 year of study, summer semester, elective
    specialization NVER , 0 year of study, summer semester, elective
    specialization NIDE , 0 year of study, summer semester, elective
    specialization NISY , 0 year of study, summer semester, elective
    specialization NEMB , 0 year of study, summer semester, elective
    specialization NSPE , 0 year of study, summer semester, compulsory
    specialization NBIO , 0 year of study, summer semester, compulsory
    specialization NSEN , 0 year of study, summer semester, elective
    specialization NVIZ , 0 year of study, summer semester, compulsory
    specialization NGRI , 0 year of study, summer semester, elective
    specialization NADE , 0 year of study, summer semester, elective
    specialization NISD , 0 year of study, summer semester, elective
    specialization NMAT , 0 year of study, summer semester, elective

Type of course unit

 

Lecture

26 hours, optional

Teacher / Lecturer

Syllabus

  1. Introduction, linear models, loss functions, learning algorithms and evaluation. (organization, NN review)
  2. Fully connected networks, loss functions for classification and regression. (presentation)
  3. Convolutional networks, locality and equivariance of computation, weight initialization, batch normalization. (presentation, weight init. tutorial)
  4. Network architectures for image classification. (presentation)
  5. Generalization, regularization, data augmentation, multi-task learning, semi-supervised learning, active learning, self-supervised learning. (presentation)
  6. Object detection: MTCNN face detector, R-CNN, Fast R-CNN, Faster R-CNN, YOLO, SSD. (presentation including image segmentation)
  7. Semantic and instance segmentation. Connections to estimation of depth, surface normals, shading and motion.
  8. Learning similarity and embedding. Person identification. (presentation)
  9. Recurrent networks and sequence processing (text and speech). Connectionist Temporal Classification (CTC). Attention networks. (presentation)
  10. Language models. Basic image captioning networks, question answering and language translation. (presentation)
  11. Generative models. Autoregressive factorization. Generative Adversarial Networks (GAN, DCGAN, CycleGAN). (presentation)
  12. Reinforcement learning. Deep Q-network (DQN) and policy gradients. (presentation)
  13. Overview of emerging applications and cutting edge research.
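The convolution operation that underlies the networks in the syllabus above can be sketched in a few lines of NumPy; this is an illustrative toy implementation (deep learning frameworks use highly optimized equivalents), showing the "valid" cross-correlation as computed by a single-channel CNN layer:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid cross-correlation of a single-channel image with a kernel,
    the core operation of a convolutional layer (no padding, stride 1)."""
    ih, iw = image.shape
    kh, kw = kernel.shape
    oh, ow = ih - kh + 1, iw - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # Each output value is a dot product of the kernel
            # with the image patch under it (weight sharing).
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

img = np.arange(16.0).reshape(4, 4)
edge = np.array([[1.0, -1.0]])   # simple horizontal-gradient filter
feat = conv2d(img, edge)         # feature map of shape (4, 3)
```

The same small set of kernel weights is applied at every spatial position, which is the locality and weight-sharing property discussed in lecture 3.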

Project

26 hours, compulsory

Teacher / Lecturer

Syllabus

Team project (2-3 students).
Individual assignments - proposed by students, approved by the teacher. Components:
  • Problem formulation, team formation.
  • Research of existing solutions and useful tools.
  • Baseline solution and evaluation proposal.
  • Data collection.
  • Experiments, testing and gradual improvement.
  • Final report and presentation of the project.