Merge pull request #22 from aai-institute/docs/clean-up
Docs/clean up
samuelburbulla committed Jan 23, 2024
2 parents 19a28f2 + a122869 commit 0d9bea5
Showing 14 changed files with 149 additions and 59 deletions.
8 changes: 6 additions & 2 deletions CHANGELOG.md
@@ -2,5 +2,9 @@

## Unreleased

- Set up minimal project structure.
- Create first notebook.
- Set up project structure.
- Implement basic functionality.
- Build documentation.
- Create first notebooks.
- Introduce neural operators.
- Add CI/CD.
19 changes: 12 additions & 7 deletions README.md
@@ -17,11 +17,10 @@ Learning function operators with neural networks.
</a>
</div>

**Continuity** is a Python package with a general interface for machine
learning on functions operators. It implements various neural network
architectures, including DeepONets or neural operators, physics-informed loss
functions to train the networks based on PDEs, and a variety of benchmarks.

**Continuity** is a Python package for machine learning on function operators.
It implements various neural operator architectures (e.g., DeepONets),
physics-informed loss functions to train based on PDEs, and a collection of
examples and benchmarks.

## Installation
Clone the repository and install the package using pip.
@@ -32,8 +31,14 @@ pip install -e .
```

## Usage
Some examples can be found in the `notebooks` and `tests` directories. See the
[documentation](https://aai-institute.github.io/Continuity/) for more details.
Our [documentation](https://aai-institute.github.io/Continuity/) contains a thorough introduction to operator learning, a collection of examples using Continuity, and full class documentation.

In general, the operator syntax in Continuity is
```python
v = operator(x, u(x), y)
```
mapping a function `u` (evaluated at `x`) to a function `v` (evaluated at `y`).
For more details, see [Learning Operators](https://aai-institute.github.io/Continuity/operators/index.html).
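In mathematical notation, this corresponds to `v = G(u)` evaluated at `y`. As a minimal sketch (assuming `operator`, `x`, `u`, and `y` are defined as above, with `u` callable), the operator can be wrapped to match this notation:

```python
# Wrap an operator (with fixed sensor positions x) to match G(u)(y).
G = lambda u: lambda y: operator(x, u(x), y)

# Apply G to a (callable) function u and evaluate the result at y.
v = G(u)(y)
```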

## Contributing
If you find a bug or have a feature request, please open an issue on GitHub. If
28 changes: 28 additions & 0 deletions docs/examples/index.md
@@ -0,0 +1,28 @@
---
title: Examples
---

This is a collection of notebooks that showcase various applications of
Continuity.

::cards:: cols=2

- title: The Basics
  content: Learning function operators with Continuity
  url: basics

- title: Physics-informed
  content: >
    Training physics-informed neural operators
  url: physicsinformed

- title: Self-supervised
  content: >
    Self-supervised training of operators
  url: selfsupervised

- title: Super-resolution
  content: Neural operators for super-resolution
  url: superresolution

::/cards::
17 changes: 11 additions & 6 deletions docs/index.md
@@ -13,10 +13,10 @@ title: Home
</div>


**Continuity** is a Python package for machine
learning on functions operators. It implements various neural network
architectures, including DeepONets or neural operators, physics-informed loss
functions to train the networks based on PDEs, and a variety of benchmarks.
**Continuity** is a Python package for machine learning on function operators.
It implements various neural operator architectures (e.g., DeepONets),
physics-informed loss functions to train based on PDEs, and a collection of
examples and benchmarks.

::cards:: cols=2

@@ -26,11 +26,16 @@ functions to train the networks based on PDEs, and a variety of benchmarks.

- title: Learning Operators
  content: >
    Basics of learning function operators with neural networks
    Basics of learning function operators
  url: operators/index.md

- title: Examples
  content: >
    Some notebooks using Continuity
  url: examples/index.md

- title: Browse the API
  content: Full documentation of the API
  content: Full class documentation
  url: api/continuity/index.md

::/cards::
17 changes: 9 additions & 8 deletions docs/operators/index.md
@@ -14,7 +14,8 @@ transfer the concept of function mapping into machine learning.

## Operators

In mathematics, _operators_ are function mappings – they map functions to functions.
In mathematics, _operators_ are function mappings: they map functions to
functions.

Let $u: X \subset \mathbb{R}^d \to \mathbb{R}^c$ be a function that maps a
$d$-dimensional input to $c$ output *channels*.
@@ -31,10 +32,10 @@ maps $u$ to a function $v: Y \subset \mathbb{R}^{p} \to \mathbb{R}^{q}$.
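For example, with $d = c = p = q = 1$, differentiation is an operator: it maps every differentiable function $u$ to its derivative,

$$
G(u) = u', \qquad v(y) = u'(y).
$$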

## Learning Operators

Learning operators is the task of learning the mapping $G$ from data.
In the context of neural networks, we want to learn a neural network $G_\theta$
Operator learning is the task of learning the mapping $G$ from data.
In the context of neural networks, we want to train a neural network $G_\theta$
with parameters $\theta$ that, given a set of input-output pairs $(u_k, v_k)$,
maps $u_k$ to $v_k$. We refer to such a neural network as **neural operator**.
maps $u_k$ to $v_k$. We refer to such an architecture as a **neural operator**.

In **Continuity**, we use the general approach of mapping function
evaluations to represent both input and output functions $u$ and $v$.
@@ -49,14 +50,14 @@ evaluations to represent both input and output functions $u$ and $v$.
neural operator architectures.

Let $x_i \in X,\ 1 \leq i \leq n,$ be a finite set of *collocation points*
(or *sensor positions*) in the domain $X$ of $u$.
(or *sensor positions*) in the input domain $X$ of $u$.
We represent the function $u$ by its evaluations at these collocation
points and write $\mathbf{x} = (x_i)_i$ and $\mathbf{u} = (u(x_i))_i$.
This finite-dimensional representation is fed into the neural operator.
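As a concrete sketch (the domain, the function, and the number of sensors are arbitrary choices for illustration):

```python
import torch

# Discretize u(x) = sin(pi x) at n = 32 collocation points in X = [-1, 1].
x = torch.linspace(-1, 1, 32)  # sensor positions (x_i)_i
u = torch.sin(torch.pi * x)    # function evaluations (u(x_i))_i
```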

The mapped function $v = G(u)$, on the other hand, is also represented by
function evaluations only. Let $y_j \in Y,\ 1 \leq j \leq m,$ be a set of
*evaluation points* (or *query points*) in the domain $Y$ of $v$ and
*evaluation points* (or *query points*) in the input domain $Y$ of $v$ and
$\mathbf{y} = (y_j)_j$.
Then, the output values $\mathbf{v} = (v(y_j))_j$ are approximated by the neural
operator
@@ -81,8 +82,8 @@ G = lambda u: lambda y: operator(x, u, y)
v = G(u)(y)
```

Operators extend the concept of neural networks to function mappings, which
Neural operators extend the concept of neural networks to function mappings, which
enables discretization-invariant and mesh-free mappings of data with
applications to physics-informed training, super-resolution, and more.

See our examples in [[operators]] for more details and further reading.
See our <a href="../examples">Examples</a> for more details and further reading.
20 changes: 3 additions & 17 deletions notebooks/basics.ipynb
@@ -102,7 +102,7 @@
"A neural operator takes an input function $u$, evaluated at collocation points $x$,\n",
"and maps it to a function $v$ evaluated at (different) evaluation positions $y$:\n",
"$$\n",
"v(y) = G(u)(y) \\approx \\operatorname{NeuralOperator}\\left(x, u(x), y\\right).\n",
"v(y) = G(u)(y) \\approx G_\\theta\\left(x, u(x), y\\right).\n",
"$$\n",
"In this example, we choose the DeepONet architecture with 32 sensors as neural\n",
"operator."
@@ -290,29 +290,15 @@
"metadata": {},
"source": [
"As you can see, the operator output (approximately) matches $v$, as desired.\n",
"That's the basics!\n",
"Note that we can also wrap a neural operator to match the mathematical notation."
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {},
"outputs": [],
"source": [
"# Wrap neural operator (with fixed x)\n",
"G = lambda u: lambda y: operator(x, u(x), y)\n",
"\n",
"# Call operator with (callable) u and obtain callable\n",
"v_y = G(u)(y)"
"That's the basics!"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"In the other examples, we explore advanced features such as \n",
"physics-informed training, self-supervised training, super-resolution etc."
"physics-informed training, self-supervised training, or super-resolution."
]
}
],
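A sketch of the DeepONet construction this notebook describes might look as follows (the constructor argument is a hypothetical name for illustration, not the verified `DeepONet` signature):

```python
from continuity.operators import DeepONet

# Hypothetical: a DeepONet operator over 32 sensor positions.
operator = DeepONet(num_sensors=32)  # `num_sensors` is an assumed parameter name
```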
11 changes: 7 additions & 4 deletions notebooks/selfsupervised.ipynb
@@ -58,15 +58,18 @@
"\n",
"Create a data set of sine waves: The `Sine` dataset generates $N$ sine waves\n",
"$$\n",
"f(x) = \\sin(w_k x), \\quad w_k = 1 + \\frac{k}{N-1}, \\quad k = 0, \\dots, N-1.\n",
"f(x) = \\sin(w_k x), \\quad w_k = 1 + \\frac{k}{N-1},\n",
"$$\n",
"As a `SelfSupervisedDataset` it exports batches of samples for self-supervised\n",
"training, namely\n",
"$$\n",
"\\quad k = 0, \\dots, N-1.\n",
"$$\n",
"We wrap the `Sine` dataset by a `SelfSupervisedDataset` that exports batches\n",
"of samples for self-supervised training, namely\n",
"$$\n",
"\\left(\\mathbf{x}, f(\\mathbf{x}), x_j, f(x_j)\\right), \\quad \\text{for } j = 1, \\dots, M,\n",
"$$\n",
"where $\\mathbf{x} = (x_i)_{i=1 \\dots M}$ are the $M$ equidistantly\n",
"distributed sensor positions. "
"distributed sensor positions."
]
},
{
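To make the sine-wave formula above concrete, here is a torch-only sketch of the generated data (the domain $[-1, 1]$ and the values of $N$ and $M$ are assumptions for illustration):

```python
import torch

N, M = 4, 32                               # number of waves and sensor positions
x = torch.linspace(-1, 1, M)               # assumed domain [-1, 1]
w = 1 + torch.arange(N).float() / (N - 1)  # w_k = 1 + k / (N - 1)
f = torch.sin(w[:, None] * x[None, :])     # f_k(x_i), shape (N, M)
```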
27 changes: 26 additions & 1 deletion src/continuity/__init__.py
@@ -1 +1,26 @@
"""The Continuity package."""
"""
**Continuity** is a Python package for machine learning on function operators.

The package is structured into the following modules:

::cards:: cols=2

- title: Operators
  content: Neural operator implementations.
  url: operators/index.md

- title: Data
  content: Data sets for training.
  url: data/index.md

- title: PDE
  content: Loss functions for physics-informed training.
  url: pde/index.md

- title: Plotting
  content: Plotting utilities.
  url: plotting/index.md

::/cards::
"""
6 changes: 4 additions & 2 deletions src/continuity/data/__init__.py
@@ -1,6 +1,8 @@
"""
This defines DataSets in Continuity.
Every data set is a list of (x, u, y, v) tuples.
`continuity.data`
Data sets in Continuity.
Every data set is a list of `(x, u, y, v)` tuples.
"""

import math
Expand Down
6 changes: 3 additions & 3 deletions src/continuity/data/datasets.py
@@ -18,10 +18,10 @@ class Sine(DataSet):
$$
f(x) = \sin(w_k x), \quad w_k = 1 + \frac{k}{N-1}, \quad k = 0, \dots, N-1.
$$
As a `SelfSupervisedDataset` it exports batches of samples for self-supervised
training, namely
It exports batches of samples
$$
\left(\mathbf{x}, f(\mathbf{x}), \mathbf{x}, f(\mathbf{x})\right),
$$
\left(\mathbf{x}, f(\mathbf{x}), x_j, f(x_j)\right), \quad \text{for } j = 1, \dots, M,
$$
where $\mathbf{x} = (x_i)_{i=1 \dots M}$ are the $M$ equidistantly
distributed sensor positions.
13 changes: 12 additions & 1 deletion src/continuity/operators/__init__.py
@@ -1,4 +1,15 @@
"""Operators in Continuity."""
"""
`continuity.operators`

Operators in Continuity.

Every operator maps collocation points `x`, function values `u`,
and evaluation points `y` to evaluations of `v`:

```
v = operator(x, u, y)
```
"""

from .operator import Operator
from .deeponet import DeepONet
12 changes: 9 additions & 3 deletions src/continuity/operators/losses.py
@@ -3,15 +3,19 @@
import torch
from torch import Tensor
from abc import abstractmethod
from typing import TYPE_CHECKING

# from continuity.operators.operator import Operator # TODO: Circular import
if TYPE_CHECKING:
    from continuity.operators.operator import Operator


class Loss:
    """Loss function for training operators in Continuity."""

    @abstractmethod
    def __call__(self, op, x: Tensor, u: Tensor, y: Tensor, v: Tensor) -> Tensor:
    def __call__(
        self, op: "Operator", x: Tensor, u: Tensor, y: Tensor, v: Tensor
    ) -> Tensor:
        """Evaluate loss.

        Args:
@@ -29,7 +33,9 @@ class MSELoss(Loss):
    def __init__(self):
        self.mse = torch.nn.MSELoss()

    def __call__(self, op, x: Tensor, u: Tensor, y: Tensor, v: Tensor) -> Tensor:
    def __call__(
        self, op: "Operator", x: Tensor, u: Tensor, y: Tensor, v: Tensor
    ) -> Tensor:
        """Evaluate MSE loss.

        Args:
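For illustration, a custom loss following this interface might look like the sketch below (the relative-MSE formula is our own example; the `op(x, u, y)` call follows the operator convention documented in `continuity.operators`):

```python
class RelativeMSELoss(Loss):
    """Hypothetical example: MSE normalized by the target's magnitude."""

    def __call__(self, op, x: Tensor, u: Tensor, y: Tensor, v: Tensor) -> Tensor:
        v_pred = op(x, u, y)                 # operator call: v = op(x, u, y)
        mse = ((v_pred - v) ** 2).mean()
        return mse / ((v**2).mean() + 1e-8)  # epsilon guards against division by zero
```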
18 changes: 14 additions & 4 deletions src/continuity/data/losses.py → src/continuity/pde/__init__.py
@@ -1,16 +1,24 @@
"""Loss functions."""
"""
`continuity.pde`

PDEs in Continuity.

Every PDE is implemented using a physics-informed loss function.
"""

from torch import Tensor
from abc import abstractmethod

# from continuity.operators.operator import Operator # TODO: Circular import
from continuity.operators.operator import Operator


class PDE:
    """PDE base class."""

    @abstractmethod
    def __call__(self, op, x: Tensor, u: Tensor, y: Tensor, v: Tensor) -> Tensor:
    def __call__(
        self, op: Operator, x: Tensor, u: Tensor, y: Tensor, v: Tensor
    ) -> Tensor:
        """Computes PDE loss."""


@@ -24,7 +32,9 @@ class PhysicsInformedLoss:
    def __init__(self, pde: PDE):
        self.pde = pde

    def __call__(self, op, x: Tensor, u: Tensor, y: Tensor, v: Tensor) -> Tensor:
    def __call__(
        self, op: Operator, x: Tensor, u: Tensor, y: Tensor, v: Tensor
    ) -> Tensor:
        """Evaluate loss.

        Args:
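As a sketch of the idea, a concrete PDE might be implemented as follows (a 1D Laplace residual of our own choosing; the autograd pattern shown is one way to obtain derivatives, not necessarily the package's):

```python
import torch


class ZeroLaplace(PDE):
    """Hypothetical example: physics-informed residual for v'' = 0 in 1D."""

    def __call__(
        self, op: Operator, x: Tensor, u: Tensor, y: Tensor, v: Tensor
    ) -> Tensor:
        y = y.detach().requires_grad_(True)
        v_pred = op(x, u, y)  # operator call convention: v = op(x, u, y)
        dv = torch.autograd.grad(v_pred.sum(), y, create_graph=True)[0]
        d2v = torch.autograd.grad(dv.sum(), y, create_graph=True)[0]
        return (d2v**2).mean()  # penalize deviations from v'' = 0
```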
6 changes: 5 additions & 1 deletion src/continuity/plotting/__init__.py
@@ -1,4 +1,8 @@
"""Plotting utilities for Continuity."""
"""
`continuity.plotting`

Plotting utilities for Continuity.
"""

import torch
import numpy as np