Commit: format

Vaibhavdixit02 committed Oct 3, 2024
1 parent 63cf8cb commit 2b22dfe
Showing 3 changed files with 17 additions and 7 deletions.
10 changes: 4 additions & 6 deletions NEWS.md
@@ -1,9 +1,7 @@
 # v4 Breaking changes
 
-1. The main change in this breaking release has been the way mini-batching is handled. The data argument in the solve call and the implicit iteration of that in the callback has been removed,
-the stochastic solvers (Optimisers.jl and Sophia) now handle it explicitly. You would now pass in a DataLoader to OptimizationProblem as the second argument to the objective etc (p) if you
-want to do minibatching, else for full batch just pass in the full data.
+1. The main change in this breaking release has been the way mini-batching is handled. The data argument in the solve call and the implicit iteration of that in the callback has been removed,
+the stochastic solvers (Optimisers.jl and Sophia) now handle it explicitly. You would now pass in a DataLoader to OptimizationProblem as the second argument to the objective etc (p) if you
+want to do minibatching, else for full batch just pass in the full data.
 
-2. The support for extra returns from objective function has been removed. Now the objective should only return a scalar loss value, hence callback doesn't take extra arguments other than the state and loss value.
-
-
+2. The support for extra returns from objective function has been removed. Now the objective should only return a scalar loss value, hence callback doesn't take extra arguments other than the state and loss value.
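
To make the two items above concrete, the sketch below shows what the v4 pattern could look like. Only the conventions stated in these release notes are taken as given: the DataLoader is passed to `OptimizationProblem` where `p` goes, the objective returns a single scalar loss, and the callback receives just the state and the loss value. Everything else (the toy data, the least-squares objective, `Adam(0.01)`, and the `maxiters` choice) is illustrative, and keyword handling such as `maxiters` versus `epochs` may differ across solver wrappers.

```julia
# Minimal sketch, not from the release notes: a toy linear least-squares fit
# mini-batched with MLUtils.DataLoader under the v4 conventions.
using Optimization, OptimizationOptimisers, Optimisers, MLUtils, ForwardDiff

x = rand(2, 128)                              # hypothetical features
y = 3 .* x[1, :] .- 2 .* x[2, :] .+ 1         # hypothetical targets
data = MLUtils.DataLoader((x, y); batchsize = 16)   # 8 mini-batches

# Objective returns only a scalar loss; the current mini-batch arrives as `p`.
function loss(θ, p)
    xb, yb = p
    ŷ = θ[1] .* xb[1, :] .+ θ[2] .* xb[2, :] .+ θ[3]
    return sum(abs2, ŷ .- yb)
end

# Callback gets only the optimizer state and the loss; returning `false` keeps going.
callback = (state, l) -> (println("loss = ", l); false)

optf = OptimizationFunction(loss, Optimization.AutoForwardDiff())
prob = OptimizationProblem(optf, zeros(3), data)      # DataLoader passed as `p`
sol = solve(prob, Optimisers.Adam(0.01); callback = callback, maxiters = 80)
```

For a full-batch run, per the note above, pass the full data in place of the DataLoader.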
13 changes: 13 additions & 0 deletions docs/src/index.md
@@ -171,9 +171,11 @@ to add the specific wrapper packages.
- Unconstrained: ✅
</details>
```

🟡 = supported in the downstream library but not yet implemented in `Optimization.jl`; PRs to add this functionality are welcome

## Citation

```
@software{vaibhav_kumar_dixit_2023_7738525,
author = {Vaibhav Kumar Dixit and Christopher Rackauckas},
@@ -185,37 +187,48 @@ to add the specific wrapper packages.
url = {https://doi.org/10.5281/zenodo.7738525},
year = 2023}
```

## Reproducibility

```@raw html
<details><summary>The documentation of this SciML package was built using these direct dependencies,</summary>
```

```@example
using Pkg # hide
Pkg.status() # hide
```

```@raw html
</details>
```

```@raw html
<details><summary>and using this machine and Julia version.</summary>
```

```@example
using InteractiveUtils # hide
versioninfo() # hide
```

```@raw html
</details>
```

```@raw html
<details><summary>A more complete overview of all dependencies and their versions is also provided.</summary>
```

```@example
using Pkg # hide
Pkg.status(; mode = PKGMODE_MANIFEST) # hide
```

```@raw html
</details>
```

```@eval
using TOML
using Markdown
Expand Down
1 change: 0 additions & 1 deletion lib/OptimizationOptimJL/src/OptimizationOptimJL.jl
@@ -292,7 +292,6 @@ function SciMLBase.__solve(cache::OptimizationCache{
else
fg! = cache.f.fg
end


gg = function (G, θ)
cache.f.grad(G, θ)
Expand Down
