Course detail

Fuzzy Systems and Neural Networks

FSI-RNF
Acad. year: 2019/2020

The course provides students with an introduction to the most commonly used paradigms of neural networks and shows technically oriented applications of neural networks and their practical use. The theory part focuses on neural dynamics (mainly activation, signals and activation models), synapse dynamics (both supervised and unsupervised learning, e.g. competitive learning and back-propagation), and network architectures. Furthermore, the neural and fuzzy representations of structured knowledge are compared and used for controller design.

Language of instruction

Czech

Number of ECTS credits

7

Mode of study

Not applicable.

Learning outcomes of the course unit

Knowledge of how neural networks function and of their appropriate use for data processing, especially for the design of “intelligent” controllers.

Prerequisites

Knowledge of matrix calculus is expected. Familiarity with optimization methods and modelling is recommended.

Co-requisites

Not applicable.

Planned learning activities and teaching methods

The course is taught through lectures explaining the basic principles and theory of the discipline. Teaching is supplemented by practical laboratory work.

Assessment methods and criteria linked to learning outcomes

Course-unit credit requirements: active participation in seminars and individual elaboration of an assigned project. The examination comprises a written and an oral part. The written part consists of a test with four questions. The oral part consists of a discussion of the written part with possible supplementary questions. The evaluation is fully at the discretion of the tutor, in accordance with the valid directives of BUT.

Course curriculum

Not applicable.

Work placements

Not applicable.

Aims

The course objective is for students to master the basics of neural modelling and the common use of neural models for data processing, as well as the framework for designing “intelligent” controllers.

Specification of controlled education, way of implementation and compensation for absences

Attendance at lectures is recommended, while attendance at seminars is obligatory. Education runs according to weekly schedules. The form of compensation for missed seminars is fully at the discretion of the tutor.

Recommended optional programme components

Not applicable.

Prerequisites and corequisites

Not applicable.

Basic literature

Mařík, V.: Umělá inteligence I, II, 1993
Rojas, R.: Neural Networks, 1996

Recommended reading

Not applicable.

Classification of course in study plans

  • Programme M2A-P Master's

    branch M-MET, 2nd year of study, winter semester, compulsory

Type of course unit


Lecture

26 hours, optional

Teacher / Lecturer

Syllabus

1. The connectionist model of the animal brain, neural dynamic systems, common signal functions, additive neural dynamics.
2. Additive bivalent models, common neural activations, learning as coding.
3. Basic rules of unsupervised learning, stochastic unsupervised learning and stochastic equilibrium.
4. Supervised learning: stochastic learning of patterns with known class membership.
5. Backpropagation algorithm.
6. Neural nets as stochastic gradient systems, synaptic convergence to centroids.
7. Global equilibrium: convergence and stability, global stability of recurrent neural nets, structural stability of unsupervised learning.
8. Fuzzy sets and systems, uncertainty in a probabilistic environment, randomness versus ambiguity.
9. Fuzzy and neural approximation of functions, neural representation and fuzzy representation of structured knowledge.
10. Controllers based on a mathematical model and on an approximator.
11. Fuzzy controllers.
12. Controllers based on Kalman filter.
13. Conclusions.
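
As an illustration of the kind of material treated in topics 5 and 9 above, the following is a minimal sketch of the back-propagation update for a single-hidden-layer network approximating a simple function. It is written in Python/NumPy purely for illustration (the laboratory exercises use MATLAB), and all sizes and constants are arbitrary choices, not values prescribed by the course.

import numpy as np

# Minimal back-propagation sketch (topic 5): one hidden layer of sigmoid
# units, linear output, mean-squared-error loss, plain gradient descent.
rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy data: approximate y = sin(x) on [0, pi] (function approximation in
# the spirit of topic 9).
X = np.linspace(0.0, np.pi, 50).reshape(-1, 1)
Y = np.sin(X)

n_hidden = 8
W1 = rng.normal(scale=0.5, size=(1, n_hidden))   # input -> hidden weights
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.5, size=(n_hidden, 1))   # hidden -> output weights
b2 = np.zeros(1)
eta = 0.1                                        # learning rate

for epoch in range(5000):
    # forward pass
    H = sigmoid(X @ W1 + b1)           # hidden activations
    Y_hat = H @ W2 + b2                # linear output unit
    err = Y_hat - Y                    # output error

    # backward pass (chain rule)
    dW2 = H.T @ err / len(X)
    db2 = err.mean(axis=0)
    dH = (err @ W2.T) * H * (1.0 - H)  # error propagated through sigmoid
    dW1 = X.T @ dH / len(X)
    db1 = dH.mean(axis=0)

    # gradient-descent weight update
    W2 -= eta * dW2; b2 -= eta * db2
    W1 -= eta * dW1; b1 -= eta * db1

print("final MSE:", float(np.mean((Y_hat - Y) ** 2)))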

Laboratory exercise

26 hours, compulsory

Teacher / Lecturer

Syllabus

1. Basics of MATLAB, basic statements, work with matrices, visualization functions.
2. Perceptron I: neuron model, activation function, network generation.
3. Perceptron II: perceptron limitations, XOR problem.
4. Linear nets I: neuron model, activation functions, network generation.
5. Linear nets II: automatic weights adjustment, learning.
6. Backpropagation I: neuron model, topology, learning algorithm, project specification.
7. Backpropagation II: algorithm limitations, momentum, adaptive learning parameters.
8. Backpropagation III: Levenberg-Marquardt learning rule.
9. Radial basis networks: neuron model, network generation, assessment of the number of input neurons.
10. Recurrent networks: Hopfield model.
11. Recurrent networks: Elman model.
12. Work on the project, project consultation.
13. Course-unit credit award.
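
For illustration of exercises 2 and 3 above, a minimal sketch of the perceptron learning rule and of the XOR limitation might look as follows. It is written in Python/NumPy for brevity (the exercises themselves use MATLAB); since XOR is not linearly separable, a single hard-limit neuron never reaches zero classification errors.

import numpy as np

# Single-layer perceptron learning rule (exercise 2) applied to the XOR
# truth table (exercise 3). XOR is not linearly separable, so the rule
# cannot drive the error count to zero, whatever the number of epochs.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0])                 # XOR targets

w = np.zeros(2)                            # weights
b = 0.0                                    # bias
eta = 1.0                                  # learning rate

for epoch in range(100):
    errors = 0
    for xi, ti in zip(X, y):
        out = 1 if xi @ w + b > 0 else 0   # hard-limit activation
        if out != ti:
            w += eta * (ti - out) * xi     # perceptron weight update
            b += eta * (ti - out)
            errors += 1
    if errors == 0:                        # would stop for a separable task
        break

print("misclassified patterns after training:", errors)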