Markus Schweighofer - Current Courses
Winter Semester 2025/2026: Linear Algebra in Data Science
9 ECTS credits, 4V+2Ü+P (4 hours of lectures and 2 hours of exercises per week; P stands for programming projects in Julia)
More information will follow soon. The course is divided into two parts. It is based on the recently published book Linear Algebra, Data Science and Machine Learning by Jeff Calder and Peter Olver. We will assume familiarity with the first four chapters and parts of Chapter 5 of that book. We currently plan to cover Section 5.7, Chapters 7 and 8, Sections 9.1 to 9.7, Section 9.10, and Sections 10.1 to 10.6 of the book.
A rough plan could be as follows:
PART I
Chapter 1: Machine Learning and Data
01, Oct 22 Review of Linear Algebra and the Julia Programming Language
02, Oct 24 Basics of Machine Learning and Data
03, Oct 29 Linear Regression
04, Oct 31 Support Vector Machines
05, Nov 05 Nearest Neighbor Classification
06, Nov 07 k-Means Clustering
07, Nov 12 Kernel Methods
Chapter 2: Singular Values and Principal Component Analysis
08, Nov 14 The Singular Value Decomposition
09, Nov 19 The Principal Components
10, Nov 21 The Best Approximating Subspace
11, Nov 26 PCA-based Compression
12, Nov 28 PCA-based Compressive Sensing
13, Dec 03 Linear Discriminant Analysis
14, Dec 05 Multidimensional Scaling
PART II
Chapter 3: Graph Theory and Graph-Based Learning
15, Dec 10 Graphs and Digraphs
16, Dec 12 The Incidence Matrix
17, Dec 17 The Graph Laplacian
18, Dec 19 Binary Spectral Clustering
19, Jan 07 Distances in Graphs
20, Jan 09 Diffusion in Graphs and Digraphs
21, Jan 14 Diffusion Maps and Spectral Embeddings
22, Jan 16 The Discrete Fourier Transform
Chapter 4: Neural Networks and Deep Learning
23, Jan 21 Neural Networks and Deep Learning
24, Jan 23 Fully Connected Networks
25, Jan 28 Backpropagation and Automatic Differentiation
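To give a first idea of the programming component, here is a minimal Julia sketch (not part of the official course material; the data matrix A and the rank-1 truncation are made-up illustrations) that computes a singular value decomposition and a best low-rank approximation with the standard LinearAlgebra package, in the spirit of lecture 08:

# Minimal Julia sketch; A and the choice k = 1 are illustrative assumptions.
using LinearAlgebra

A = [3.0 1.0 1.0;
     -1.0 3.0 1.0]            # small made-up 2x3 data matrix

F = svd(A)                    # F.U, F.S, F.Vt with A ≈ F.U * Diagonal(F.S) * F.Vt

k = 1                         # keep only the largest singular value
A1 = F.U[:, 1:k] * Diagonal(F.S[1:k]) * F.Vt[1:k, :]   # best rank-1 approximation

println(F.S)                  # singular values in decreasing order
println(norm(A - A1))         # Frobenius error; here it equals the discarded singular value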