Course detail
Stochastic Processes
FSI-SSP, Acad. year: 2018/2019
The course provides an introduction to the theory of stochastic processes. The following topics are covered: types and basic characteristics, the covariance function, spectral density, stationarity, examples of typical processes, time series and their evaluation, parametric and nonparametric methods, identification of periodic components, and ARMA processes. The methods are applied in a project on time series evaluation and prediction, supported by the computational system MATLAB.
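For illustration, a minimal sketch of the kind of MATLAB computation the course relies on is given below. It uses base MATLAB only; the ARMA(1,1) coefficients (0.6 and 0.4), the series length, and the lag range are invented for demonstration and are not taken from the course materials. The sketch simulates the process and computes its sample autocorrelation function:

    n   = 500;
    e   = randn(n,1);                    % Gaussian white noise
    x   = filter([1 0.4], [1 -0.6], e);  % X_t = 0.6*X_{t-1} + e_t + 0.4*e_{t-1}
    x   = x - mean(x);
    acf = zeros(21,1);
    for h = 0:20                         % sample autocorrelation up to lag 20
        acf(h+1) = sum(x(1:n-h) .* x(1+h:n)) / sum(x.^2);
    end
    stem(0:20, acf), xlabel('lag h'), ylabel('sample ACF')

The same filter call generates a pure AR(1) or MA(1) series by replacing the corresponding coefficient vector with [1].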
Language of instruction
Number of ECTS credits
Mode of study
Guarantor
Department
Learning outcomes of the course unit
Prerequisites
Co-requisites
Planned learning activities and teaching methods
Assessment methods and criteria linked to learning outcomes
Course curriculum
Work placements
Aims
Specification of controlled education, way of implementation and compensation for absences
Recommended optional programme components
Prerequisites and corequisites
Basic literature
Recommended reading
Ljung, L. System Identification: Theory for the User. 2nd ed. Upper Saddle River, NJ: Prentice Hall PTR, 1999. (EN)
Classification of course in study plans
- Programme M2A-P Master's
branch M-MAI, 1st year of study, summer semester, compulsory
- Programme IT-MSC-2 Master's
branch MMI, 0 year of study, summer semester, elective
branch MBI, 0 year of study, summer semester, elective
branch MSK, 0 year of study, summer semester, elective
branch MMM, 0 year of study, summer semester, compulsory-optional
branch MBS, 0 year of study, summer semester, elective
branch MPV, 0 year of study, summer semester, elective
Type of course unit
Lecture
Teacher / Lecturer
Syllabus
2. Consistent system of distribution functions, strict and weak stationarity.
3. Moment characteristics: mean and autocorrelation function.
4. Spectral density function (properties).
5. Decomposition model (additive, multiplicative), variance stabilization.
6. Identification of periodic components: periodogram, periodicity tests.
7. Methods of separating periodic components.
8. Methods of trend estimation: polynomial regression, linear filters, splines.
9. Tests of randomness.
10. Best linear prediction, Yule-Walker system of equations, prediction error.
11. Partial autocorrelation function, Durbin-Levinson and innovations algorithms (a short sketch of the Durbin-Levinson recursion follows this syllabus).
12. Linear systems and convolution, causality, stability, response.
13. ARMA processes and their special cases (AR and MA processes).
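As a pointer for items 10 and 11, the Durbin-Levinson recursion solves the Yule-Walker system order by order and yields the partial autocorrelations as a by-product. The sketch below is a minimal base-MATLAB implementation under the assumption that gam is a column vector holding the autocovariances γ(0), ..., γ(p); the function name durbin_levinson and all variable names are illustrative, not taken from the lecture notes.

    function [phi, v] = durbin_levinson(gam, p)
    % gam : column vector [gamma(0); gamma(1); ...; gamma(p)] of autocovariances
    % phi : phi(k,1:k) are the AR(k) coefficients; diag(phi) is the PACF
    % v   : v(k+1) is the mean squared one-step prediction error at order k
        phi  = zeros(p, p);
        v    = zeros(p+1, 1);
        v(1) = gam(1);
        phi(1,1) = gam(2) / gam(1);
        v(2) = v(1) * (1 - phi(1,1)^2);
        for k = 2:p
            num          = gam(k+1) - phi(k-1,1:k-1) * gam(k:-1:2);
            phi(k,k)     = num / v(k);
            phi(k,1:k-1) = phi(k-1,1:k-1) - phi(k,k) * phi(k-1,k-1:-1:1);
            v(k+1)       = v(k) * (1 - phi(k,k)^2);
        end
    end

Applied to the sample autocovariances of an observed series, phi(p,1:p) is the Yule-Walker estimate of an AR(p) model, diag(phi) is the sample partial autocorrelation function, and v(p+1) estimates the one-step prediction error variance.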
Computer-assisted exercise
Teacher / Lecturer
Syllabus
2. Simulation of time series with typical autocorrelation functions: white noise, coloured noise with correlation at lag one, series exhibiting a linear trend and/or periodicities.
3. Detecting heteroscedasticity. Transformations stabilizing variance (power and Box-Cox transform).
4. Identification of periodic components: periodogram and periodicity testing (a small sketch follows this syllabus).
5. Use of the linear regression model for time series decomposition.
6. Estimation of polynomial degree for trend and separation of periodic components.
7. Denoising by means of linear filtration (moving average): design of optimal weights preserving polynomials up to a given degree, Spencer's 15-point moving average.
8. Filtering by means of stepwise polynomial regression.
9. Filtering by means of exponential smoothing.
10. Randomness tests.
11. Simulation, identification, parameter estimation, and verification of an ARMA model.
12. Testing significance of (partial) correlations.
13. Tutorials on student projects.
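As a concrete counterpart to exercise items 2 and 4, the following base-MATLAB sketch simulates a series containing one hidden periodic component and locates it with a raw periodogram computed from the FFT; the series length, frequency, and amplitude are invented for illustration only.

    n    = 256;
    t    = (0:n-1)';
    x    = 2*sin(2*pi*0.1*t) + randn(n,1);   % hidden period of 10 samples plus noise
    x    = x - mean(x);
    X    = fft(x);
    per  = abs(X(2:n/2+1)).^2 / n;           % periodogram at Fourier frequencies j/n
    freq = (1:n/2)' / n;                     % frequencies in cycles per sample
    [~, j] = max(per);
    fprintf('dominant frequency %.3f cycles per sample (true value 0.100)\n', freq(j));

Under a white-noise null hypothesis the periodogram ordinates are approximately independent exponential variables, which is what the periodicity tests of item 4 exploit when judging whether the largest ordinate is significant.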