Pattern Recognition and Machine Learning II
The course Pattern Recognition and Machine Learning I is a prerequisite for this lecture.
This course (lecture and exercise) provides a basic introduction to the field concerned with analyzing data, recognizing regularities in data, and building models from data. It continues the course Pattern Recognition and Machine Learning I.
The lecture discusses fundamental methods and procedures on the basis of an internationally established standard textbook for pattern recognition and machine learning. The aim is to understand these techniques in such depth that they can not only be applied competently but also extended. Among other things, the following topics will be discussed: kernel functions and statistical learning theory (including support vector machines), Bayesian networks and Markov random fields, expectation maximization and variational inference, sampling methods, continuous latent variables (including principal component analysis, kernel PCA, and autoassociators), and ensemble techniques.
In the exercise, the application of these techniques is examined with the help of Jupyter notebooks and suitable Python libraries. Sample data sets from different application fields are considered. The aim is to learn the safe, systematic, and careful use of the techniques mentioned above.
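As a small illustration of the kind of exercise workflow described above, the following sketch applies principal component analysis to a standard sample data set. It assumes scikit-learn as one of the "suitable Python libraries"; the specific library and data set are illustrative, not prescribed by the course.

```python
# Illustrative sketch (not official course material): dimensionality
# reduction with PCA on a standard sample data set, as it might appear
# in an exercise notebook. Assumes scikit-learn is installed.
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

# Load a small sample data set: 150 observations, 4 features each.
X = load_iris().data

# Project the data onto the two principal components that capture
# the largest share of the variance.
pca = PCA(n_components=2)
X_reduced = pca.fit_transform(X)

print(X_reduced.shape)                      # reduced representation
print(pca.explained_variance_ratio_.sum())  # fraction of variance retained
```

In a notebook, the reduced two-dimensional representation can then be plotted to inspect the structure of the data visually.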
The course provides the prerequisites for subsequent courses such as the laboratory Deep Learning and the lecture/exercise Autonomous Learning.