Commit c20de33
Signed-off-by: zeramorphic <[email protected]>
1 parent: 08201a6
Showing 12 changed files with 84 additions and 23 deletions.

@@ -1,6 +1,18 @@
\chapter[Analysis and Topology \\ \textnormal{\emph{Lectured in Michaelmas \oldstylenums{2021} by \textsc{Dr.\ V.\ Zs\'ak}}}]{Analysis and Topology}
\emph{\Large Lectured in Michaelmas \oldstylenums{2021} by \textsc{Dr.\ V.\ Zs\'ak}}

[[INTRODUCTION]]
In the analysis part of the course, we continue the study of convergence from Analysis I.
We define a stronger notion of convergence, called uniform convergence, and show that it has some very desirable properties.
For example, if integrable functions \( f_n \) converge uniformly to the integrable function \( f \), then the integrals of the \( f_n \) converge to the integral of \( f \).
The same cannot be said in general of mere pointwise convergence.
We also extend our study of differentiation to functions with multiple input and output variables, and rigorously define the derivative in this higher-dimensional context.

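As a rough illustration (a standard formulation, stated here only for orientation), uniform convergence on \( [a, b] \) demands a single rate of convergence valid at every point of the interval:
\[ f_n \to f \text{ uniformly on } [a, b] \iff \sup_{x \in [a, b]} \lvert f_n(x) - f(x) \rvert \to 0 \quad \text{as } n \to \infty, \]
and it is precisely this uniform control that allows the limit and the integral to be exchanged, giving \( \int_a^b f_n \to \int_a^b f \).
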
In the topology part of the course, we consider familiar spaces such as \( [a,b], \mathbb C, \mathbb R^n \), and generalise their properties.
We arrive at the definition of a metric space, which encapsulates all of the information about how near or far points are from one another.
From here, we can define notions such as continuity of functions between metric spaces in a way that does not depend on the particular underlying space.

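For reference, the standard axioms for a metric \( d \) on a set \( X \) require, for all \( x, y, z \in X \),
\[ d(x, y) \geq 0 \text{ with equality exactly when } x = y, \qquad d(x, y) = d(y, x), \qquad d(x, z) \leq d(x, y) + d(y, z). \]
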
We then generalise even further to define topological spaces.
The only information a topological space contains is the neighbourhoods of each point, but it turns out that this is still enough to define continuous functions and related notions.
We study topological spaces in an abstract setting, and prove important facts that are used in many later courses.

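For instance, continuity can be phrased purely in terms of open sets (the standard formulation):
\[ f \colon X \to Y \text{ is continuous} \iff f^{-1}(U) \text{ is open in } X \text{ for every open } U \subseteq Y. \]
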
\subfile{../../ib/antop/main.tex}
@@ -1,6 +1,12 @@
\chapter[Complex Analysis \\ \textnormal{\emph{Lectured in Lent \oldstylenums{2022} by \textsc{Prof.\ N.\ Wickramasekera}}}]{Complex Analysis}
\emph{\Large Lectured in Lent \oldstylenums{2022} by \textsc{Prof.\ N.\ Wickramasekera}}

[[INTRODUCTION]]
Complex differentiation is a stronger notion than real differentiation.
Many functions that are differentiable as functions of two real variables are not complex differentiable; the complex conjugate function is one example.
This stronger notion allows us to prove some surprising results.
It turns out that if a function is complex differentiable once in a neighbourhood of a point, then it is given by a convergent power series in some neighbourhood of that point.

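In symbols (the standard statement, with \( D(a, r) \) denoting the open disc of radius \( r \) about \( a \)): if \( f \) is complex differentiable on \( D(a, r) \), then
\[ f(z) = \sum_{n = 0}^{\infty} c_n (z - a)^n \quad \text{for all } z \in D(a, r), \qquad \text{where } c_n = \frac{f^{(n)}(a)}{n!}. \]
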
Another interesting result is Cauchy's integral formula: if a function is complex differentiable in a neighbourhood of a point, one can evaluate the function at that point using a certain integral over any loop around that point.
A similar result gives an arbitrary derivative of the function at that point using a single integral.

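In symbols (the standard statements, for a positively oriented circle \( \gamma \) around \( a \), lying inside the region where \( f \) is complex differentiable):
\[ f(a) = \frac{1}{2\pi i} \int_{\gamma} \frac{f(z)}{z - a} \, \mathrm{d}z, \qquad f^{(n)}(a) = \frac{n!}{2\pi i} \int_{\gamma} \frac{f(z)}{(z - a)^{n + 1}} \, \mathrm{d}z. \]
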
\subfile{../../ib/ca/main.tex}
@@ -1,6 +1,10 @@
\chapter[Geometry \\ \textnormal{\emph{Lectured in Lent \oldstylenums{2022} by \textsc{Prof.\ I.\ Smith}}}]{Geometry}
\emph{\Large Lectured in Lent \oldstylenums{2022} by \textsc{Prof.\ I.\ Smith}}

[[INTRODUCTION]]
This course serves as an introduction to the modern study of surfaces in geometry.
A surface is a topological space that locally looks like the plane.
The notions of length and area on a surface are governed by mathematical objects called the fundamental forms of the surface at particular points.
We can use integrals to work out exact lengths and areas.
We study various spaces, including spaces of constant curvature, such as the plane, spheres, and hyperbolic space.

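As a concrete illustration (standard notation; the local parametrisation \( \sigma(u, v) \) is an assumption of this sketch rather than notation fixed above), the first fundamental form records the inner products of the tangent vectors \( \sigma_u, \sigma_v \), and the length of a curve \( \gamma(t) = \sigma(u(t), v(t)) \) is then computed by integration:
\[ E = \sigma_u \cdot \sigma_u, \quad F = \sigma_u \cdot \sigma_v, \quad G = \sigma_v \cdot \sigma_v, \qquad \mathrm{length}(\gamma) = \int \sqrt{E \dot u^2 + 2 F \dot u \dot v + G \dot v^2} \, \mathrm{d}t. \]
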
\subfile{../../ib/geom/main.tex}
@@ -1,6 +1,14 @@
\chapter[Groups, Rings and Modules \\ \textnormal{\emph{Lectured in Lent \oldstylenums{2022} by \textsc{Dr.\ R.\ Zhou}}}]{Groups, Rings and Modules}
\emph{\Large Lectured in Lent \oldstylenums{2022} by \textsc{Dr.\ R.\ Zhou}}

[[INTRODUCTION]]
A ring is an algebraic structure with an addition and a multiplication operation.
Common examples of rings include \( \mathbb Z, \mathbb Q, \mathbb R, \mathbb C \), the Gaussian integers \( \mathbb Z[i] = \qty{a + bi \mid a, b \in \mathbb Z} \), the quotient \( \faktor{\mathbb Z}{n\mathbb Z} \), and the set of polynomials with complex coefficients.
We can study factorisation in a general ring, generalising the idea of factorising integers or polynomials.
Certain rings, called unique factorisation domains, share with the integers the property that every nonzero non-invertible element can be expressed as a product of irreducibles that is unique up to reordering and multiplication by units (in \( \mathbb Z \), the irreducibles are the prime numbers).
We study this property, and many others, in this course.

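For example (standard computations, included purely for illustration), \( 5 \) is prime in \( \mathbb Z \) but is no longer irreducible in the Gaussian integers, while the two factorisations of \( 6 \) below agree up to reordering and units, exactly as unique factorisation demands:
\[ 5 = (2 + i)(2 - i) \text{ in } \mathbb Z[i], \qquad 6 = 2 \cdot 3 = (-2) \cdot (-3) \text{ in } \mathbb Z. \]
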
Modules are like vector spaces, but instead of being defined over a field, they are defined over an arbitrary ring.
In particular, every vector space is a module, because every field is a ring.
We use the theory built up over the course to prove that every \( n \times n \) complex matrix can be written in Jordan normal form.

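Roughly speaking (shown here only to illustrate the shape of the result), the Jordan normal form is block diagonal, with each block of the form
\[ J_k(\lambda) = \begin{pmatrix} \lambda & 1 & & \\ & \lambda & \ddots & \\ & & \ddots & 1 \\ & & & \lambda \end{pmatrix}, \]
so that, after a suitable change of basis, the matrix becomes a direct sum of such blocks.
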
\subfile{../../ib/grm/main.tex}
@@ -1,6 +1,11 @@
\chapter[Linear Algebra \\ \textnormal{\emph{Lectured in Michaelmas \oldstylenums{2021} by \textsc{Prof.\ P.\ Raphael}}}]{Linear Algebra}
\emph{\Large Lectured in Michaelmas \oldstylenums{2021} by \textsc{Prof.\ P.\ Raphael}}

[[INTRODUCTION]]
Linear algebra is the field of study that deals with vector spaces and linear maps.
A vector space can be thought of as a generalisation of \( \mathbb R^n \) or \( \mathbb C^n \), although vector spaces can be defined over any field (not just \( \mathbb R \) or \( \mathbb C \)), and may be infinite-dimensional.
In this course, we mainly study finite-dimensional vector spaces and the linear maps between them.
Any linear map between finite-dimensional vector spaces can be encoded as a matrix.
Such maps have invariants such as their trace and determinant, which can be easily obtained from a matrix representing them.
As was shown for real matrices in Vectors and Matrices, if the determinant of a matrix is nonzero, it can be inverted.

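As a small reminder (a standard \( 2 \times 2 \) computation, not tied to any particular part of the notes), the determinant test can be seen explicitly:
\[ A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}, \qquad \det A = ad - bc, \qquad \det A \neq 0 \implies A^{-1} = \frac{1}{ad - bc} \begin{pmatrix} d & -b \\ -c & a \end{pmatrix}. \]
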
\subfile{../../ib/linalg/main.tex}
@@ -1,6 +1,11 @@
\chapter[Markov Chains \\ \textnormal{\emph{Lectured in Michaelmas \oldstylenums{2021} by \textsc{Dr.\ P.\ Sousi}}}]{Markov Chains}
\emph{\Large Lectured in Michaelmas \oldstylenums{2021} by \textsc{Dr.\ P.\ Sousi}}

[[INTRODUCTION]]
A Markov chain is a common type of random process in which each state depends only on the previous one.
Due to their simplicity, Markov chains show up in many areas of probability theory and have many real-world applications, for example in computer science.

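In symbols (the standard formulation for a time-homogeneous chain, with \( p_{ij} \) denoting the transition probabilities), the defining Markov property is
\[ \mathbb P(X_{n + 1} = j \mid X_n = i, X_{n - 1} = x_{n - 1}, \dots, X_0 = x_0) = \mathbb P(X_{n + 1} = j \mid X_n = i) = p_{ij}. \]
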
One example of a Markov chain is a simple random walk, in which a particle moves around an infinite lattice of points, choosing its next direction of movement at random.
It turns out that if the lattice is one- or two-dimensional, the particle returns to its starting point infinitely many times with probability 1.
However, if the lattice is three-dimensional or higher, the probability that the particle ever returns to its starting point is strictly less than 1, and with probability 1 it returns only finitely many times.

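One standard way to see this dichotomy is through the expected number of visits to the starting point: the starting state is recurrent, meaning it is revisited infinitely often with probability 1, exactly when the series
\[ \sum_{n = 0}^{\infty} \mathbb P(X_n = 0 \mid X_0 = 0) \]
diverges.
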
\subfile{../../ib/markov/main.tex}
@@ -1,6 +1,14 @@
\chapter[Methods \\ \textnormal{\emph{Lectured in Michaelmas \oldstylenums{2021} by \textsc{Prof.\ E.\ P.\ Shellard}}}]{Methods}
\emph{\Large Lectured in Michaelmas \oldstylenums{2021} by \textsc{Prof.\ E.\ P.\ Shellard}}

[[INTRODUCTION]]
In this course, we discuss various methods for solving differential equations.
Different forms of differential equations need different solution strategies, and we study a wide range of common types of differential equation.

A particularly powerful method for solving differential equations involves the use of Green's functions.
For example, physical systems can involve bodies whose mass is spread over a region of space with some given density.
Green's functions allow the equation to be solved for a single point mass; integrating that solution against the density then gives the solution for the extended body.

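A small model case (a standard textbook example, chosen here purely for illustration): for \( -u''(x) = f(x) \) on \( [0, 1] \) with \( u(0) = u(1) = 0 \), the solution is an integral of \( f \) against the Green's function,
\[ u(x) = \int_0^1 G(x, \xi) f(\xi) \, \mathrm{d}\xi, \qquad G(x, \xi) = \begin{cases} x (1 - \xi) & x \leq \xi, \\ \xi (1 - x) & x \geq \xi. \end{cases} \]
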
Fourier transforms are another way to solve differential equations.
Sometimes a differential equation is easier to solve after applying the Fourier transform to the relevant function; applying the inverse Fourier transform then recovers the solution to the original equation.

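For instance (a simple illustrative case, under the usual convention in which differentiation becomes multiplication by \( ik \)), the equation \( u'' - u = f \) becomes algebraic after transforming:
\[ -k^2 \hat u(k) - \hat u(k) = \hat f(k) \implies \hat u(k) = -\frac{\hat f(k)}{1 + k^2}, \]
and inverting the transform recovers \( u \).
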
\subfile{../../ib/methods/main.tex}
@@ -1,6 +1,9 @@
\chapter[Quantum Mechanics \\ \textnormal{\emph{Lectured in Michaelmas \oldstylenums{2021} by \textsc{Dr.\ M.\ Ubiali}}}]{Quantum Mechanics}
\emph{\Large Lectured in Michaelmas \oldstylenums{2021} by \textsc{Dr.\ M.\ Ubiali}}

[[INTRODUCTION]]
In this course, we explore the basics of quantum mechanics using the Schr\"odinger equation.
This equation describes how a quantum wavefunction changes over time.
By solving the Schr\"odinger equation with different potentials and boundary conditions, we can understand some of the ways in which quantum mechanics differs from classical physics, explaining some of the scientific discoveries of the past century.
We prove some theoretical facts about quantum operators and observables, such as the uncertainty principle, which roughly states that it is impossible to know both the position and the momentum of a particle to arbitrary precision at the same time.

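For concreteness (standard statements, written here in one space dimension with potential \( V \)), the time-dependent Schr\"odinger equation and the position-momentum uncertainty relation read
\[ i \hbar \frac{\partial \psi}{\partial t} = -\frac{\hbar^2}{2m} \frac{\partial^2 \psi}{\partial x^2} + V(x) \psi, \qquad \Delta x \, \Delta p \geq \frac{\hbar}{2}. \]
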
\subfile{../../ib/quantum/main.tex}
@@ -1,6 +1,12 @@
\chapter[Statistics \\ \textnormal{\emph{Lectured in Lent \oldstylenums{2022} by \textsc{Dr.\ S.\ Bacallado}}}]{Statistics}
\emph{\Large Lectured in Lent \oldstylenums{2022} by \textsc{Dr.\ S.\ Bacallado}}

[[INTRODUCTION]]
An estimator is a random variable, computed from sample data, that approximates an unknown parameter.
For instance, the parameter could be the mean of a normal distribution, and the estimator could be the sample mean.
In this course, we study how estimators behave, what properties they have, and how we can use them to draw conclusions about the true parameters.
This is called parametric inference: the study of inferring parameters from statistics of sample data.

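As a concrete example (standard notation), if \( X_1, \dots, X_n \sim N(\mu, \sigma^2) \) are independent samples, the sample mean is an unbiased estimator of \( \mu \):
\[ \bar X = \frac{1}{n} \sum_{i = 1}^n X_i, \qquad \mathbb E[\bar X] = \mu, \qquad \operatorname{Var}(\bar X) = \frac{\sigma^2}{n}. \]
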
Towards the end of the course, we study the normal linear model, which is a useful way to model data that is believed to depend linearly on a vector of inputs, together with some normally distributed noise.
Even nonlinear patterns can be analysed using this model, by letting the inputs to the model be polynomials in the real-world data.

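In symbols (the usual formulation, assuming the design matrix \( X \) has full column rank), the model and its least squares estimator are
\[ Y = X \beta + \varepsilon, \quad \varepsilon \sim N(0, \sigma^2 I), \qquad \hat\beta = (X^\top X)^{-1} X^\top Y. \]
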
\subfile{../../ib/stats/main.tex}