Mathematics for Computer Science and Philosophy: 2019-2020
This is a second/third-year course for Computer Science & Philosophy students. They will attend a subset of the first-year lectures in Linear Algebra and Continuous Mathematics, and will thus be equipped with the knowledge needed for courses in Machine Learning.
This course is part of both the Preliminary Examination for Computer Science students and the Final Honour School for Computer Science and Philosophy students. Note that the questions set on this course in the Final Honour School in Computer Science and Philosophy will be more challenging than those set for the Preliminary Examination in Computer Science; students should bear this in mind when attempting sample and past exam questions.
Classes: Colleges will arrange problem classes for their students, as with the corresponding first-year courses: three classes for Linear Algebra and two for Continuous Mathematics.
Examinations: A two-hour exam at the end of the third year, with students answering two questions from a choice of three.
Lectures 1-20 cover the syllabus for the Preliminary Examination in Computer Science.
Lectures 1-17 cover the syllabus for the Final Honour School in Computer Science and Philosophy.
Lectures 1-2 Vectors: Vectors and geometry in two and three space dimensions. Algebraic properties. Dot products and the norm of a vector. Important inequalities. Vector spaces, subspaces and vector space axioms. Complex vector spaces.
Lectures 3-5 Independence and orthogonality: Linear independence of vectors. Basis and dimension of a vector space. Orthogonal vectors and subspaces. The Gram-Schmidt orthogonalisation.
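The Gram-Schmidt orthogonalisation named above can be sketched in a few lines. This is an illustrative implementation in plain Python (not part of the course materials), assuming the input vectors are linearly independent:

```python
def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def gram_schmidt(vectors):
    """Orthogonalise a list of linearly independent vectors."""
    ortho = []
    for v in vectors:
        # Subtract from v its projection onto each vector found so far.
        w = list(v)
        for u in ortho:
            coeff = dot(v, u) / dot(u, u)
            w = [wi - coeff * ui for wi, ui in zip(w, u)]
        ortho.append(w)
    return ortho

# Example: orthogonalise two vectors in R^2.
u1, u2 = gram_schmidt([[1.0, 1.0], [1.0, 0.0]])
print(dot(u1, u2))  # → 0.0 (the resulting vectors are orthogonal)
```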
Lectures 6-8 Matrices: Matrix operations. Column, row and null space. Rank of a matrix. Inverse and transpose. Elementary matrices. The Gauss-Jordan method.
Lectures 9-11 Systems of linear equations: Examples of linear systems. Geometry of linear equations. Gaussian elimination. Row echelon form. Homogeneous and nonhomogeneous systems of linear equations. Application to the intersection of lines and planes.
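Gaussian elimination with back substitution, as covered in these lectures, can be sketched as follows. This is an illustrative implementation, not the course's own code; it uses partial pivoting and assumes the system has a unique solution:

```python
def gauss_solve(A, b):
    """Solve Ax = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    # Work on an augmented copy [A | b].
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for k in range(n):
        # Partial pivoting: bring the largest remaining pivot into row k.
        p = max(range(k, n), key=lambda i: abs(M[i][k]))
        M[k], M[p] = M[p], M[k]
        # Eliminate column k below the pivot.
        for i in range(k + 1, n):
            factor = M[i][k] / M[k][k]
            for j in range(k, n + 1):
                M[i][j] -= factor * M[k][j]
    # Back substitution on the resulting row echelon form.
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

# Example: 2x + y = 3, x + 3y = 5 has solution x = 0.8, y = 1.4.
print(gauss_solve([[2.0, 1.0], [1.0, 3.0]], [3.0, 5.0]))  # → [0.8, 1.4]
```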
Lectures 12-14 Elementary matrix factorisations and determinants: LU factorisation, related algorithms and operation count. PLU factorisation. Calculating the determinant of a matrix. Properties of the determinant of a matrix. Application examples: area, volume and cross product.
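The LU factorisation mentioned above can be illustrated by the following sketch (Doolittle's variant, without the pivoting that the PLU factorisation adds; it assumes nonzero pivots and is not part of the course materials):

```python
def lu(A):
    """Doolittle LU factorisation without pivoting (assumes nonzero pivots)."""
    n = len(A)
    L = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    U = [row[:] for row in A]
    for k in range(n):
        for i in range(k + 1, n):
            L[i][k] = U[i][k] / U[k][k]   # elimination multiplier, stored in L
            for j in range(k, n):
                U[i][j] -= L[i][k] * U[k][j]
    return L, U

L, U = lu([[4.0, 3.0], [6.0, 3.0]])
print(L)  # → [[1.0, 0.0], [1.5, 1.0]]
print(U)  # → [[4.0, 3.0], [0.0, -1.5]]
```

Since U is upper triangular, the determinant of A is the product of its diagonal entries (here 4 × −1.5 = −6), which is one reason the factorisation is useful.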
Lectures 15-17 Eigenvalues and eigenvectors: Definition. Similarity and diagonalisation.
Lectures 18-20 Linear transformations: Definition and examples. Properties and composition of linear transformations. Rotations, reflections and stretches. Translations using homogeneous coordinates. One-to-one and onto transformations.
*[3 lectures] Derivatives, partial derivatives, differentiation with respect to a vector, gradient, Hessian and Jacobian. Taylor's theorem in 1 dimension (Lagrange remainder), Taylor's theorem in n dimensions (remainder only briefly). Examples.
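As a concrete instance of the one-dimensional case, the Taylor polynomial of exp about 0 can be computed and compared against the true value. This is an illustrative sketch, not course code; the error bound quoted in the comment is the Lagrange remainder specialised to exp:

```python
import math

def exp_taylor(x, n):
    """Degree-n Taylor polynomial of exp about 0: sum of x^k / k!."""
    return sum(x**k / math.factorial(k) for k in range(n + 1))

# By Taylor's theorem with Lagrange remainder, the error at x > 0 is at
# most e^x * x^(n+1) / (n+1)!, so it shrinks rapidly as the degree grows.
for n in (2, 4, 8):
    print(n, abs(math.exp(1.0) - exp_taylor(1.0, n)))
```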
*[3 lectures] Optimization in 1 and n dimensions. Classification of turning points via Taylor's theorem. Convexity. Constrained optimization: Lagrange multipliers. Examples.
[2-3 lectures] Numerical integration in 1 dimension: midpoint and Simpson’s rules, complexity and error analysis. Briefly, integration in n dimensions and Monte Carlo methods. Examples.
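The composite midpoint and Simpson's rules named above can be sketched as follows (an illustrative implementation, not the course's own code):

```python
def midpoint(f, a, b, n):
    """Composite midpoint rule on [a, b] with n subintervals."""
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

def simpson(f, a, b, n):
    """Composite Simpson's rule on [a, b]; n must be even."""
    h = (b - a) / n
    s = f(a) + f(b)
    s += 4 * sum(f(a + i * h) for i in range(1, n, 2))
    s += 2 * sum(f(a + i * h) for i in range(2, n, 2))
    return s * h / 3

# Integrate x^2 on [0, 1]; the exact value is 1/3.  The midpoint rule has
# error O(h^2); Simpson's rule is exact for polynomials up to degree 3.
print(midpoint(lambda x: x * x, 0.0, 1.0, 100))
print(simpson(lambda x: x * x, 0.0, 1.0, 100))
```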
*[1 lecture] Floating-point numbers and rules of thumb for accuracy in practice. Convergence rates of iterative methods.
*[2-3 lectures] Numerical root finding in 1 dimension by bisection, Newton's method, secant method. Root finding in n dimensions by Newton's method, and briefly quasi-Newton methods. Complexity and error analysis. Examples.
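Bisection and Newton's method in one dimension can be sketched as follows; this is an illustrative implementation (the tolerances and starting values are arbitrary choices, not course specifications):

```python
def bisect(f, a, b, tol=1e-12):
    """Bisection: f must change sign on [a, b]; converges linearly."""
    while b - a > tol:
        m = (a + b) / 2
        if f(a) * f(m) <= 0:
            b = m
        else:
            a = m
    return (a + b) / 2

def newton(f, df, x, tol=1e-12):
    """Newton's method: converges quadratically near a simple root."""
    while abs(f(x)) > tol:
        x -= f(x) / df(x)
    return x

# Both find sqrt(2) as the positive root of x^2 - 2 = 0.
print(bisect(lambda x: x * x - 2, 0.0, 2.0))
print(newton(lambda x: x * x - 2, lambda x: 2 * x, 1.0))
```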
[2-3 lectures] Numerical optimization in 1 and n dimensions: root finding for gradient and gradient descent methods. Complexity and error analysis. Examples.
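A fixed-step gradient descent, the simplest of the gradient methods mentioned above, can be sketched as follows (an illustrative implementation with an arbitrarily chosen step size and test function, not course code):

```python
def gradient_descent(grad, x, step=0.1, tol=1e-10, max_iter=10_000):
    """Fixed-step gradient descent; stops when the gradient is near zero."""
    for _ in range(max_iter):
        g = grad(x)
        if max(abs(gi) for gi in g) < tol:
            break
        x = [xi - step * gi for xi, gi in zip(x, g)]
    return x

# Minimise f(x, y) = (x - 1)^2 + 2*(y + 2)^2, whose minimiser is (1, -2).
grad = lambda p: [2 * (p[0] - 1), 4 * (p[1] + 2)]
x_min = gradient_descent(grad, [0.0, 0.0])
print([round(v, 6) for v in x_min])  # → [1.0, -2.0]
```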
[1-2 lectures] Applications.
Sections marked * cover the syllabus for the Continuous Maths part of Mathematics for Computer Science and Philosophy.
Vector spaces and subspaces. Matrices. Inverse matrices. Solution of linear systems. Elementary matrix factorisations. Eigenvalues and eigenvectors.
Derivatives, partial derivatives, differentiation with respect to a vector; gradient, Hessian, Jacobian. Taylor’s theorem in 1- and n-dimensions, with Lagrange remainder. Optimization in 1- and n-dimensions, classification of stationary points. Constrained optimization and the method of Lagrange multipliers. Iterative numerical methods and rates of convergence. Methods for numerical root finding in 1- and n-dimensions, complexity and (simple cases of) error analysis.
Linear Algebra: Introduction to Linear Algebra, Gilbert Strang, Wellesley-Cambridge Press.
Continuous Maths: Please see the course materials page for Continuous Maths.