Releases: linjing-lab/optimtool

optimtool-2.5.2

19 Oct 09:59

Fixed Bug:

  • add an assertion in _search/ZhangHanger that raises an AssertionError unless alpha > 0, prompting users to adjust the default parameters.

New Example:

import optimtool.unconstrain as ou
from optimtool.base import sp
x = sp.symbols("x1:5")
f = 100 * (x[1] - x[0]**2)**2 + \
    (1 - x[0])**2 + \
    100 * (x[3] - x[2]**2)**2 + \
    (1 - x[2])**2
x_0 = (-1.2, 1, -1.2, 1)
barzilar_borwein = ou.gradient_descent.barzilar_borwein
barzilar_borwein(f, x, x_0, verbose=True, method="ZhangHanger", c1=0.8, beta=0.8, eta=0.6)

see the tests and examples for fine-tuning the default parameters of more algorithms.
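The ZhangHanger search mentioned above follows the Zhang–Hager nonmonotone line search scheme. The sketch below is a hypothetical standalone version (the function name and signature are not optimtool's internal API) showing how the alpha > 0 assertion and the nonmonotone reference value Ck interact with backtracking:

```python
import numpy as np

def zhang_hager_search(f, grad, xk, d, alpha=1.0, c1=0.8, beta=0.8, eta=0.6,
                       Ck=None, Qk=1.0):
    # Hypothetical sketch of a Zhang-Hager nonmonotone line search,
    # not optimtool's internal implementation.
    assert alpha > 0  # mirrors the assertion added in _search/ZhangHanger
    Ck = f(xk) if Ck is None else Ck      # nonmonotone reference value
    gd = grad(xk) @ d                     # directional derivative
    while f(xk + alpha * d) > Ck + c1 * alpha * gd:
        alpha *= beta                     # backtrack until acceptance
    Qk1 = eta * Qk + 1.0                  # update averaging weight
    Ck1 = (eta * Qk * Ck + f(xk + alpha * d)) / Qk1
    return alpha, Ck1, Qk1
```

With c1, beta, and eta matching the example's keyword arguments, the accepted step always decreases the objective relative to Ck.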

optimtool-2.5.1

09 Oct 02:41

Fixed Bug:

  • the step size of Lasso in the example should conform to the Lipschitz continuity condition, the same as tk in any module of the hybrid file.
  • test the lagrange_augmented algorithm and correct the name of the inner gradient-norm epsilon default parameter.
import optimtool.constrain as oc
import sympy as sp
f, x1, x2 = sp.symbols("f x1 x2")
f = (x1 - 2)**2 + (x2 - 1)**2
c1 = x1 - x2 - 1
c2 = 0.25*x1**2 - x2 - 1
oc.mixequal.lagrange_augmentedm(f, (x1, x2), c1, c2, (1., 0.), verbose=True)
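The Lipschitz step-size condition mentioned in the first bullet can be illustrated with a small sketch (the variable names and the least-squares setup are illustrative, not taken from optimtool's Lasso module): for the smooth part f(x) = 0.5*||Ax - b||^2, the gradient is Lipschitz with constant L equal to the largest eigenvalue of A^T A, so tk = 1/L is a safe step size.

```python
import numpy as np

np.random.seed(0)
A = np.random.randn(20, 5)
b = np.random.randn(20)

# Lipschitz constant of the gradient of 0.5*||Ax - b||^2
L = np.linalg.eigvalsh(A.T @ A).max()
tk = 1.0 / L  # step size conforming to the Lipschitz continuity condition

# one gradient step from the origin is guaranteed to decrease the objective
x0 = np.zeros(5)
g = A.T @ (A @ x0 - b)
x1 = x0 - tk * g
```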

see the tests and examples for the full ecosystem of optimization methods.

optimtool-2.5.0

05 Oct 09:36

New Traits:

  • upgrade Development Status from Beta to Production/Stable.
  • add hybrid methods (FISTA, Nesterov) to solve a new class of optimization problems.
  • test and verify all executable algorithms in the newly developed component.

Hybrid Optimization:

import optimtool.hybrid as oh
from optimtool.base import sp
x = sp.symbols("x1:3")
f = (2 - (sp.cos(x[0]) + sp.cos(x[1])) + (1 - sp.cos(x[0])) - sp.sin(x[0]))**2 + \
    (2 - (sp.cos(x[0]) + sp.cos(x[1])) + 2 * (1 - sp.cos(x[1])) - sp.sin(x[1]))**2
x_0 = (0.2, 0.2) # randomly chosen initial point
oh.fista.normal(f, x, x_0, verbose=True, epsilon=1e-4)
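For intuition about what the FISTA family does, here is a minimal self-contained sketch for the Lasso problem min 0.5*||Ax - b||^2 + lam*||x||_1 (the function name fista_lasso and the problem setup are illustrative assumptions; optimtool's hybrid module works on symbolic functions as shown above):

```python
import numpy as np

def fista_lasso(A, b, lam, tk, iters=200):
    # Minimal FISTA sketch: proximal gradient with Nesterov momentum.
    x = np.zeros(A.shape[1])
    y, t = x.copy(), 1.0
    for _ in range(iters):
        g = A.T @ (A @ y - b)                 # gradient of the smooth part at y
        z = y - tk * g                        # forward (gradient) step
        x_new = np.sign(z) * np.maximum(np.abs(z) - tk * lam, 0)  # L1 prox
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        y = x_new + ((t - 1) / t_new) * (x_new - x)  # momentum extrapolation
        x, t = x_new, t_new
    return x
```

Here tk should again satisfy the Lipschitz condition, i.e. tk <= 1/L with L the largest eigenvalue of A^T A.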

see the tests and examples for more compatible usage; we hope more issues will be discussed.

optimtool-2.5.0-pre

03 Oct 18:38

New Traits:

  • more robust detection of illegal input: FuncArray, ArgArray, and PointArray are supported via optimtool/_typing.py.
  • add base.py so that numpy, sympy, and matplotlib.pyplot are exposed at the top level of optimtool for convenient operations.
  • every method of optimtool supports printing details about point, f, and k via verbose, which defaults to False.
  • rewrite the __doc__ of each method in English, with more logical context for users to run experiments.
  • retest algorithms with configured parameters to prepare for the next version's hybrid and existing files.

Import:

import optimtool as oo
from optimtool.base import np, sp, plt
# see optimtool/base.py

see the tests and examples for more compatible usage; users can extend the hybrid methods according to _proxim.py and their needs.

optimtool-2.4.4

01 Jun 12:05

Simple Case:

import optimtool as oo
import sympy as sp
x1, x2, x3, x4 = sp.symbols("x1 x2 x3 x4") # Declare symbolic variables
f = (x1 - 1)**2 + (x2 - 1)**2 + (x3 - 1)**2 + (x1**2 + x2**2 + x3**2 + x4**2 - 0.25)**2
oo.unconstrain.gradient_descent.barzilar_borwein(f, [x1, x2, x3, x4], (1, 2, 3, 4)) # funcs, args, x_0

Bugs Fixed:

  • update _kernel with more concise and comprehensive kernel selectors: kernel, linear_search, nonmonotonic_search.
  • add a variable to set the unconstrained break precision for constrained optimization methods, like penalty_quadratic.
import optimtool.constrain as oc
import sympy as sp
f, x1, x2 = sp.symbols("f x1 x2")
f = (x1 - 2)**2 + (x2 - 1)**2
c1 = x1 - x2 - 1
c2 = 0.25*x1**2 - x2 - 1
oc.mixequal.penalty_L1(f, (x1, x2), c1, c2, (1.5, 0.5), epsk=1e-4) # use `epsk` to set break epsilon of `kernel`

introduction to hybrid (to be uploaded to optimtool in v2.5.0):

delta = x_0 - tk * gradient # gradient is the Jacobian of f(x); g(x) is not differentiable.

proximal operators and iteration:

x_0 = np.sign(delta) * np.maximum(np.abs(delta) - tk, 0) # L1
norm = np.linalg.norm(delta)
x_0 = (1 - tk / norm) * delta if norm > tk else 0 # L2
x_0 = (delta + np.sqrt(delta**2 + 4 * tk)) / 2 # -\sum_{i=1}^{n}\ln(xi), n=len(x_0)
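The three update rules above can be packaged as small helper functions (a sketch; the names prox_l1, prox_l2, and prox_neglog are illustrative, not optimtool's API). Note that the L1 case needs np.maximum, the element-wise maximum, rather than the reduction np.max:

```python
import numpy as np

def prox_l1(delta, tk):
    # soft-thresholding: prox of tk * ||x||_1
    return np.sign(delta) * np.maximum(np.abs(delta) - tk, 0)

def prox_l2(delta, tk):
    # block soft-thresholding: prox of tk * ||x||_2
    norm = np.linalg.norm(delta)
    return (1 - tk / norm) * delta if norm > tk else np.zeros_like(delta)

def prox_neglog(delta, tk):
    # prox of -tk * sum(log(x_i)); the result is always positive
    return (delta + np.sqrt(delta**2 + 4 * tk)) / 2
```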

optimtool-2.4.3

01 Jun 09:55

Simple Case:

import optimtool as oo
import sympy as sp
x1, x2, x3, x4 = sp.symbols("x1 x2 x3 x4") # Declare symbolic variables
f = (x1 - 1)**2 + (x2 - 1)**2 + (x3 - 1)**2 + (x1**2 + x2**2 + x3**2 + x4**2 - 0.25)**2
oo.unconstrain.gradient_descent.barzilar_borwein(f, [x1, x2, x3, x4], (1, 2, 3, 4)) # funcs, args, x_0

Bugs Fixed:

  • update _convert/h2h: reduce the number of corrections applied to the hessian matrix.
import optimtool.unconstrain as ou
ou.newton.modified(f, [x1, x2, x3, x4], (1, 2, 3, 4)) # funcs, args, x_0

optimtool-2.4.2

17 Apr 15:20

Simple Case:

import optimtool as oo
import sympy as sp
x1, x2, x3, x4 = sp.symbols("x1 x2 x3 x4") # Declare symbolic variables
f = (x1 - 1)**2 + (x2 - 1)**2 + (x3 - 1)**2 + (x1**2 + x2**2 + x3**2 + x4**2 - 0.25)**2
oo.unconstrain.gradient_descent.barzilar_borwein(f, [x1, x2, x3, x4], (1, 2, 3, 4)) # funcs, args, x_0

Bugs Fixed:

  • update _convert/h2h: replace the check that all eigenvalues of the hessian are > 0 with a check that the rank of the matrix == n.
  • simplify assignment when initializing the space for search, point, and f.
  • reduce redundant assignment of iteration points in some methods, like trust_region/steihaug_CG.
  • select the trust_region method as the default configuration for constrained optimization.

optimtool-2.4.1

10 Nov 06:21

Simple Case:

import optimtool as oo
import sympy as sp
x1, x2, x3, x4 = sp.symbols("x1 x2 x3 x4") # Declare symbolic variables
f = (x1 - 1)**2 + (x2 - 1)**2 + (x3 - 1)**2 + (x1**2 + x2**2 + x3**2 + x4**2 - 0.25)**2
oo.unconstrain.gradient_descent.barzilar_borwein(f, [x1, x2, x3, x4], (1, 2, 3, 4)) # funcs, args, x_0

optimtool-2.4.0

08 Nov 13:37

Simple Case:

import optimtool as oo
import sympy as sp
x1, x2, x3, x4 = sp.symbols("x1 x2 x3 x4") # Declare symbolic variables
f = (x1 - 1)**2 + (x2 - 1)**2 + (x3 - 1)**2 + (x1**2 + x2**2 + x3**2 + x4**2 - 0.25)**2
oo.unconstrain.gradient_descent.barzilar_borwein(f, [x1, x2, x3, x4], (1, 2, 3, 4)) # funcs, args, x_0

Use FuncArray, ArgArray, PointArray, IterPointType, and OutputType for typing, and delete the functions/ folder. Many techniques were used to accelerate the methods; they are too numerous to enumerate here.

optimtool-2.3.5

25 Apr 05:48

In v2.3.4, we called a method as follows:

import optimtool as oo
import sympy as sp
x1, x2, x3, x4 = sp.symbols("x1 x2 x3 x4")
f = (x1 - 1)**2 + (x2 - 1)**2 + (x3 - 1)**2 + (x1**2 + x2**2 + x3**2 + x4**2 - 0.25)**2
funcs = sp.Matrix([f])
args = sp.Matrix([x1, x2, x3, x4])
x_0 = (1, 2, 3, 4)
oo.unconstrain.gradient_descent.barzilar_borwein(funcs, args, x_0)

But in v2.3.5, we call a method as follows (it reduces the trouble of constructing data externally):

import optimtool as oo
import sympy as sp
x1, x2, x3, x4 = sp.symbols("x1 x2 x3 x4") # Declare symbolic variables
f = (x1 - 1)**2 + (x2 - 1)**2 + (x3 - 1)**2 + (x1**2 + x2**2 + x3**2 + x4**2 - 0.25)**2
oo.unconstrain.gradient_descent.barzilar_borwein(f, [x1, x2, x3, x4], (1, 2, 3, 4)) # funcs, args, x_0
# funcs(args) can be list, tuple, sp.Matrix

The functional parameters of the built-in methods are similar to those of the MATLAB Optimization Toolbox, and optimtool supports more methods.