Block course during the lecture-free period covering the basics of the typesetting system LaTeX, with particular focus on writing mathematical texts. Further information on the schedule and registration can be found on the course website.
- Teacher: Mechtild Callies
- Teacher: Julia Karczewski
- Teacher: Katharina Oberpriller
Enrolment key: aml25
Credits: 6 ECTS (2 SWS lecture + 2 SWS exercise)
Modules:
- MSc FiMa: WP23 „Advanced Topics in Computer and Data Science B”
- MSc Math: WP42 „Überblick über ein aktuelles Forschungsgebiet B”
Description:
Real-world applications of machine learning require not only a strong theoretical foundation but also solid knowledge of the methodologies, tools, and heuristics essential for implementing machine learning algorithms. However, the practical aspects of machine learning are often overlooked in mathematics programs. This course bridges that gap by providing students with hands-on experience in the implementation and empirical analysis of machine learning algorithms, critical skills for those pursuing careers in data analysis or machine learning.
Content:
The course covers fundamental topics such as linear regression, gradient descent, regularization techniques, logistic regression, support vector machines (SVMs), and basic neural networks. Additionally, the course will explore advanced optimization methods, multi-class classification strategies, and ensemble learning techniques such as boosting and bagging.
A key component of the course is extensive programming in Python, using libraries such as NumPy, Matplotlib, Pandas, and scikit-learn. We will work with real datasets, including the MNIST handwritten digits, the Boston Housing dataset, and the Wine dataset.
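To give a flavor of the hands-on component, here is a minimal sketch of the first two topics (linear regression fitted by batch gradient descent) in NumPy. The data, step size, and iteration count are illustrative choices, not taken from the course materials:

```python
import numpy as np

# Illustrative synthetic regression data (not from the course).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))           # design matrix, 100 samples, 2 features
true_w = np.array([2.0, -1.0])          # ground-truth weights
y = X @ true_w + 0.1 * rng.normal(size=100)

# Batch gradient descent on the mean squared error.
w = np.zeros(2)
lr = 0.1                                # step size (learning rate)
for _ in range(500):
    grad = (2 / len(y)) * X.T @ (X @ w - y)   # gradient of the MSE
    w -= lr * grad
```

After a few hundred iterations, `w` is close to `true_w`; the same fit could equally be obtained with scikit-learn's `LinearRegression`, which the course also uses.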
- Teacher: Mariia Seleznova
Enrolment key: omsose25
Credits: 6 ECTS
Format: 4 hours lecture, 2 hours exercise
Target audience: MSc FiMa & Math
Modules:
- MSc FiMa: TBA
- MSc Math: TBA
Description:
Optimization is the discipline of finding the "best" alternative among a set of possible options with respect to a given objective function. The course is devoted to the study of the most widely used optimization methods and their convergence analysis. Throughout the lecture, students will learn how to select the best-suited optimization method for a given problem and to evaluate the expected rate of convergence of the algorithm in that specific scenario. The focus will be on continuous optimization, meaning that we will consider problems whose variables live in a continuous vector space.
Content:
- basics of optimization;
- first order methods (gradient descent, conjugate gradient, Barzilai-Borwein and Polyak step);
- line search methods (Armijo, nonmonotone);
- second order methods (Newton, Quasi-Newton, Trust-Region);
- constrained optimization (projected gradient method, KKT conditions).
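The first-order and line-search topics above can be combined in a few lines. Below is a minimal sketch of gradient descent with Armijo backtracking; the function names, the test quadratic, and all parameter values are illustrative assumptions, not material from the course:

```python
import numpy as np

def armijo_gd(f, grad, x0, c=1e-4, beta=0.5, tol=1e-8, max_iter=1000):
    """Gradient descent with Armijo backtracking line search (sketch)."""
    x = x0.astype(float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        t = 1.0
        # Shrink the step until the Armijo sufficient-decrease condition holds:
        # f(x - t g) <= f(x) - c * t * ||g||^2
        while f(x - t * g) > f(x) - c * t * (g @ g):
            t *= beta
        x = x - t * g
    return x

# Illustrative strongly convex quadratic: f(x) = 0.5 x^T A x - b^T x,
# whose unique minimizer is the solution of A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b

x_star = armijo_gd(f, grad, np.zeros(2))
```

On this problem the iterates converge to `np.linalg.solve(A, b)`; in the lecture the same condition is analyzed to obtain convergence rates.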
- Teacher: Arinze Folarin
- Teacher: Leonardo Galli
- Teacher: Garam Kim
Password for signing up on the webpage!
Sign-up starts 14 April 2025.
- Teacher: Maximilian Duell
- Teacher: Thomas Sørensen