Commit

initial release
sshin23 committed Jul 30, 2023
1 parent 8aede90 commit c52f90c
Showing 33 changed files with 622 additions and 554 deletions.
16 changes: 16 additions & 0 deletions .github/workflows/CompatHelper.yml
@@ -0,0 +1,16 @@
name: CompatHelper
on:
schedule:
- cron: '00 00 * * *'
workflow_dispatch:
jobs:
CompatHelper:
runs-on: ubuntu-latest
steps:
- name: Pkg.add("CompatHelper")
run: julia -e 'using Pkg; Pkg.add("CompatHelper")'
- name: CompatHelper.main()
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
COMPATHELPER_PRIV: ${{ secrets.COMPATHELPER_PRIV }} # optional
run: julia -e 'using CompatHelper; CompatHelper.main()'
15 changes: 15 additions & 0 deletions .github/workflows/TagBot.yml
@@ -0,0 +1,15 @@
name: TagBot
on:
issue_comment:
types:
- created
workflow_dispatch:
jobs:
TagBot:
if: github.event_name == 'workflow_dispatch' || github.actor == 'JuliaTagBot'
runs-on: ubuntu-latest
steps:
- uses: JuliaRegistries/TagBot@v1
with:
token: ${{ secrets.GITHUB_TOKEN }}
ssh: ${{ secrets.DOCUMENTER_KEY }}
24 changes: 24 additions & 0 deletions .github/workflows/docs.yml
@@ -0,0 +1,24 @@
name: Documentation

on:
push:
branches:
- main # update to match your development branch (master, main, dev, trunk, ...)
tags: '*'
pull_request:

jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: julia-actions/setup-julia@latest
with:
version: '1.9'
- name: Install dependencies
run: julia --project=docs/ -e 'using Pkg; Pkg.develop(PackageSpec(path=pwd())); Pkg.instantiate()'
- name: Build and deploy
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} # If authenticating with GitHub Actions token
DOCUMENTER_KEY: ${{ secrets.DOCUMENTER_KEY }} # If authenticating with SSH deploy key
run: julia --project=docs/ docs/make.jl
25 changes: 25 additions & 0 deletions .github/workflows/test.yml
@@ -0,0 +1,25 @@
name: build

on: [push, pull_request]

jobs:
test:
runs-on: ${{ matrix.os }}
strategy:
matrix:
julia-version: ['1.9']
julia-arch: [x64]
os: [ubuntu-latest,macos-latest,windows-latest]

steps:
- uses: actions/checkout@v2
- uses: julia-actions/setup-julia@latest
with:
version: ${{ matrix.julia-version }}
- uses: julia-actions/julia-buildpkg@latest
- uses: julia-actions/julia-runtest@latest
- uses: julia-actions/julia-processcoverage@v1
- uses: codecov/codecov-action@v1
with:
file: lcov.info
token: ${{ secrets.CODECOV_TOKEN }}
24 changes: 13 additions & 11 deletions Project.toml
@@ -6,26 +6,28 @@ version = "0.1.0"
[deps]
NLPModels = "a4795742-8479-5a88-8948-cc11e1c8c1a6"

[compat]
julia = "1.9"
KernelAbstractions = "0.9"

[weakdeps]
SpecialFunctions = "276daf66-3868-5448-9aa4-cd146d93841b"
KernelAbstractions = "63c18a36-062a-441e-b654-da1e3ab1ce7c"
CUDA = "052768ef-5323-5732-b1bb-66c8b64840ba"
KernelAbstractions = "63c18a36-062a-441e-b654-da1e3ab1ce7c"
SpecialFunctions = "276daf66-3868-5448-9aa4-cd146d93841b"
oneAPI = "8f75cd03-7ff8-4ecb-9b8f-daf728133b1b"

[extensions]
SIMDiffSpecialFunctions = "SpecialFunctions"
SIMDiffKernelAbstractions = "KernelAbstractions"
SIMDiffCUDA = "CUDA"
SIMDiffKernelAbstractions = "KernelAbstractions"
SIMDiffOneAPI = "oneAPI"
SIMDiffSpecialFunctions = "SpecialFunctions"

[compat]
KernelAbstractions = "0.9"
julia = "1.9"

[extras]
Test = "8dfed614-e22c-5e08-85e1-65c5234f0b40"
NLPModelsIpopt = "f4238b75-b362-5c4c-b852-0801c9a21d71"
NLPModels = "a4795742-8479-5a88-8948-cc11e1c8c1a6"
ADNLPModels = "54578032-b7ea-4c30-94aa-7cbd1cce6c9a"
NLPModelsIpopt = "f4238b75-b362-5c4c-b852-0801c9a21d71"
SIMDiffExamples = "ff8351d9-12a3-4c2d-a61a-51dfbae68567"
Test = "8dfed614-e22c-5e08-85e1-65c5234f0b40"

[targets]
test = ["Test", "NLPModelsIpopt", "ADNLPModels"]
test = ["Test", "NLPModels", "NLPModelsIpopt", "ADNLPModels", "KernelAbstractions", "CUDA"]
17 changes: 17 additions & 0 deletions README.md
@@ -1 +1,18 @@
# SIMDiff.jl
*An implementation of SIMD abstraction for nonlinear programs and automatic differentiation.*

| **License** | **Documentation** | **Build Status** | **Coverage** |
|:-----------------:|:----------------:|:----------------:|:----------------:|
| [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT) | [![doc](https://img.shields.io/badge/docs-dev-blue.svg)](https://sshin23.github.io/SIMDiff.jl/) | [![build](https://github.com/sshin23/SIMDiff.jl/actions/workflows/test.yml/badge.svg)](https://github.com/sshin23/SIMDiff.jl/actions/workflows/test.yml) | [![codecov](https://codecov.io/gh/sshin23/SIMDiff.jl/branch/main/graph/badge.svg?token=8ViJWBWnZt)](https://codecov.io/gh/sshin23/SIMDiff.jl) |

## Introduction
SIMDiff.jl employs what we call **SIMD abstraction for nonlinear programs** (NLPs), which **preserves the parallelizable structure** within the model equations and thereby enables **efficient, parallel derivative evaluations** on the **GPU**.

SIMDiff.jl is different from other algebraic modeling tools, such as JuMP or AMPL, in the following ways:
- **Modeling Interface**: SIMDiff.jl requires users to always specify the model equations as `Iterable`s (see the sketch after this list). This allows SIMDiff.jl to preserve the SIMD-compatible structure in the model equations.
- **Performance**: SIMDiff.jl compiles (via Julia's compiler) derivative evaluation code that is specialized to each computation pattern, based on reverse-mode automatic differentiation. This makes derivative evaluation (even on the CPU) significantly faster than with other existing tools.
- **Portability**: SIMDiff.jl can evaluate derivatives on GPU accelerators. The code is currently tested only on NVIDIA GPUs, but the GPU kernels are implemented mostly on top of the portable programming framework KernelAbstractions.jl. In the future, we are interested in supporting Intel, AMD, and Apple GPUs.
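
A minimal sketch of the `Iterable`-based interface, mirroring the API used in the getting-started guide of this release; the tiny problem size and the simplified constraint expression below are purely illustrative:

```julia
using SIMDiff

N = 10                                 # illustrative problem size

c = SIMDiff.Core()                     # container for the model equations

# variables, with start values supplied as an iterable
x = SIMDiff.variable(c, N; start = (mod(i, 2) == 1 ? -1.2 : 1.0 for i = 1:N))

# objective and constraint terms are passed as generators, so each
# computation pattern stays intact for SIMD-style parallel evaluation
SIMDiff.objective(c, 100 * (x[i-1]^2 - x[i])^2 + (x[i-1] - 1)^2 for i = 2:N)
SIMDiff.constraint(c, 3x[i+1]^3 + 2x[i+2] - 5 for i = 1:N-2)

m = SIMDiff.Model(c)                   # instantiate an NLPModels-compatible model
```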

## Supporting SIMDiff.jl
- Please report issues and feature requests via the [GitHub issue tracker](https://github.com/sshin23/SIMDiff.jl/issues).
- Questions are welcome at the [GitHub discussion forum](https://github.com/sshin23/SIMDiff.jl/discussions).
20 changes: 7 additions & 13 deletions docs/make.jl
@@ -1,20 +1,14 @@
using Documenter, MadDiff, Literate
using Documenter, SIMDiff, Literate

const _PAGES = [
"Introduction" => "index.md",
"Quick Start"=>"guide.md",
"How it Works" => "tutorial.md",
"API Manual" => [
"MadDiffCore" => "core.md",
"MadDiffSpecialFunctions" => "special.md",
"MadDiffModels" => "models.md",
"MadDiffMOI" => "moi.md",
]
"API Manual" => "core.md",
]

const _JL_FILENAMES = [
"guide.jl",
"tutorial.jl"
# "tutorial.jl"
]

for jl_filename in _JL_FILENAMES
@@ -30,15 +24,15 @@ end


makedocs(
sitename = "MadDiff",
sitename = "SIMDiff.jl",
authors = "Sungho Shin",
format = Documenter.LaTeX(platform="docker"),
pages = _PAGES
)

makedocs(
sitename = "MadDiff",
modules = [MadDiff],
sitename = "SIMDiff.jl",
modules = [SIMDiff],
authors = "Sungho Shin",
format = Documenter.HTML(
prettyurls = get(ENV, "CI", nothing) == "true",
@@ -51,6 +45,6 @@ makedocs(


deploydocs(
repo = "github.com/sshin23/MadDiff.jl.git"
repo = "github.com/sshin23/SIMDiff.jl.git"
)

Empty file removed docs/src/algorithms.md
Empty file.
4 changes: 2 additions & 2 deletions docs/src/core.md
@@ -1,4 +1,4 @@
# MadDiffCore
# SIMDiff
```@autodocs
Modules = [MadDiffCore]
Modules = [SIMDiff]
```
41 changes: 18 additions & 23 deletions docs/src/guide.jl
@@ -1,46 +1,41 @@
# # Getting Started
# SIMDiff provides a built-in API for creating nonlinear programming models and allows solving the created models using NLP solvers (in particular, those that are interfaced with `NLPModels`, such as [NLPModelsIpopt](https://github.com/JuliaSmoothOptimizers/NLPModelsIpopt.jl)). We now use `SIMDiff`'s built-in API to model the following nonlinear program:
# SIMDiff can create nonlinear programming models and allows solving the created models using NLP solvers (in particular, those that are interfaced with `NLPModels`, such as [NLPModelsIpopt](https://github.com/JuliaSmoothOptimizers/NLPModelsIpopt.jl)). We now use `SIMDiff` to model the following nonlinear program:
# ```math
# \begin{aligned}
# \min_{\{x_i\}_{i=1}^N} &\sum_{i=2}^N 100(x_{i-1}^2-x_i)^2+(x_{i-1}-1)^2\\
# \text{s.t.} & 3x_{i+1}^3+2x_{i+2}-5+\sin(x_{i+1}-x_{i+2})\sin(x_{i+1}+x_{i+2})+4x_{i+1}-x_i e^{x_i-x_{i+1}}-3 = 0, \quad i=1,\ldots,N-2
# \end{aligned}
# ```
# We model the problem with:
using SIMDiff

# We set
N = 10000

# First, we create a `SIMDiffModel`.
m = SIMDiffModel()
# First, we create a `SIMDiff.Core`.
c = SIMDiff.Core()

# The variables can be created as follows:
x = [variable(m; start = mod(i,2)==1 ? -1.2 : 1.) for i=1:N];

x = SIMDiff.variable(
c, N;
start = (mod(i,2)==1 ? -1.2 : 1. for i=1:N)
)

# The objective can be set as follows:
objective(m, sum(100(x[i-1]^2-x[i])^2+(x[i-1]-1)^2 for i=2:N));
SIMDiff.objective(c, 100*(x[i-1]^2-x[i])^2+(x[i-1]-1)^2 for i in 2:N)

# The constraints can be set as follows:
for i=1:N-2
constraint(m, 3x[i+1]^3+2*x[i+2]-5+sin(x[i+1]-x[i+2])sin(x[i+1]+x[i+2])+4x[i+1]-x[i]exp(x[i]-x[i+1])-3 == 0);
end
SIMDiff.constraint(
c,
3x[i+1]^3+2*x[i+2]-5+sin(x[i+1]-x[i+2])sin(x[i+1]+x[i+2])+4x[i+1]-x[i]exp(x[i]-x[i+1])-3
for i in 1:N-2)

# The important last step is instantiating the model. This step must be taken before calling optimizers.
instantiate!(m)
# Finally, we create an NLPModel.
m = SIMDiff.Model(c)

# To solve the problem with `Ipopt`,
using NLPModelsIpopt
sol = ipopt(m);

# The solution `sol` contains the field `sol.solution` holding the optimized parameters.
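
# A minimal sketch of inspecting the result (assuming the standard
# `GenericExecutionStats` fields returned by NLPModelsIpopt):
println(sol.status)         # termination status reported by Ipopt
println(sol.objective)      # objective value at the returned point
println(sol.solution[1:5])  # first few entries of the optimized variables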

# ### SIMDiff as an AD backend of JuMP
# SIMDiff can be used as an automatic differentiation backend of JuMP. The problem above can be modeled in `JuMP` and solved with `Ipopt` along with `SIMDiff`

using JuMP, Ipopt

m = JuMP.Model(Ipopt.Optimizer)

@variable(m, x[i=1:N], start=mod(i,2)==1 ? -1.2 : 1.)
@NLobjective(m, Min, sum(100(x[i-1]^2-x[i])^2+(x[i-1]-1)^2 for i=2:N))
@NLconstraint(m, [i=1:N-2], 3x[i+1]^3+2*x[i+2]-5+sin(x[i+1]-x[i+2])sin(x[i+1]+x[i+2])+4x[i+1]-x[i]exp(x[i]-x[i+1])-3 == 0)

optimize!(m; differentiation_backend = SIMDiffAD())

2 comments on commit c52f90c

@sshin23
Collaborator Author

@JuliaRegistrator

Registration pull request created: JuliaRegistries/General/88683

After the above pull request is merged, it is recommended that a tag is created on this repository for the registered package version.

This will be done automatically if the Julia TagBot GitHub Action is installed, or can be done manually through the github interface, or via:

git tag -a v0.1.0 -m "<description of version>" c52f90c8ace456ed738bb70d64d41f4fbb738953
git push origin v0.1.0
