singleTaskOptimization
- This post shares my understanding of single-task optimization. Imagine you are producing one product: you need to find the parameter settings that give the product the best performance.
Optimization algorithms
- Basis of Optimization
- My solution – Single Task Optimization
- Thinking
Basis of Optimization
1) Summary:
SciPy's optimize sub-package provides implementations of the common optimization algorithms, which we can call directly to solve our optimization problems. A handy feature of optimize is that the function name usually tells you which algorithm is used. Below is an overview of the functions in the optimize package:
a) Nonlinear Optimization
fmin – simple Nelder-Mead algorithm
fmin_powell – improved Powell algorithm
fmin_bfgs – Quasi-Newton method
fmin_cg – Nonlinear conjugate gradient method
fmin_ncg – Line-search Newton conjugate gradient method
leastsq – Least squares method
b) Constrained Multivariate Function Problem
fmin_l_bfgs_b —L-BFGS-B algorithm
fmin_tnc —truncated Newton method (uses gradient information)
fmin_cobyla —constrained optimization by linear approximation (COBYLA)
fmin_slsqp —sequential least squares programming
nnls —minimize || Ax - b ||_2 subject to x >= 0 (non-negative least squares)
c) Global Optimization
anneal —Simulated annealing algorithm
brute —Brute Force
d) Scalar Function
fminbound —bounded minimization of a scalar function
brent —Brent's method for scalar minimization
golden —golden-section search
bracket —bracket the minimum of a function
e) Fitting
curve_fit – Fitting using nonlinear least squares
f) Scalar Function Root
brentq —classic Brent (1973)
brenth —a variation on classic Brent (1980)
ridder —Ridders' method, named after its inventor
bisect —bisection method
newton —Newton's method
fixed_point —find a fixed point of a function
g) Multidimensional Function Root Finding
fsolve —general purpose; the most commonly used
broyden1 —Broyden’s first Jacobian approximation.
broyden2 —Broyden’s second Jacobian approximation
newton_krylov —Krylov approximation for inverse Jacobian
anderson —extended Anderson mixing
excitingmixing —tuned diagonal Jacobian approximation
linearmixing —scalar Jacobian approximation
diagbroyden —diagonal Broyden Jacobian approximation
h) Utility Function
line_search —find an alpha value that satisfies the strong Wolfe conditions
check_grad —Check the correctness of the gradient function by comparing with the forward finite difference approximation
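For a quick taste of this catalog, here is a small sketch of my own (a toy example, not from the SciPy docs) using two of the scalar routines listed above, fminbound and brentq:
import numpy as np
from scipy.optimize import fminbound, brentq
# A simple scalar function with its minimum at x = 2 and a root at x = 2 - sqrt(3)
f = lambda x: (x - 2.0) ** 2 - 3.0
# fminbound: minimize f on the interval [0, 5]
xmin = fminbound(f, 0.0, 5.0)
# brentq: find a root of f on an interval where f changes sign
root = brentq(f, -2.0, 2.0)
print(xmin)   # approximately 2.0
print(root)   # approximately 2 - sqrt(3), about 0.268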
2) Practical nonlinear optimization
The complete format of calling ‘fmin’ is:
fmin(func, x0, args=(), xtol=0.0001, ftol=0.0001, maxiter=None, maxfun=None, full_output=0, disp=1, retall=0, callback=None)
from scipy.optimize import fmin  # import the package
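For example, here is a minimal usage sketch of my own, minimizing a simple quadratic with the Nelder-Mead simplex:
import numpy as np
from scipy.optimize import fmin
# Toy objective with its minimum at (1.0, 2.5)
def func(x):
    return (x[0] - 1.0)**2 + (x[1] - 2.5)**2
x0 = np.array([0.0, 0.0])   # initial guess
xopt = fmin(func, x0)       # Nelder-Mead simplex search
print(xopt)                 # approximately [1.0, 2.5]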
My solution – Single Task Optimization
- Package and methods
- Solution thoughts
1) Package and Methods
Constrained minimization of multivariate scalar functions (minimize)
The minimize function provides algorithms for constrained minimization, namely 'trust-constr', 'SLSQP' and 'COBYLA'. They require the constraints to be defined using slightly different structures. The method 'trust-constr' requires the constraints to be defined as a sequence of LinearConstraint and NonlinearConstraint objects. Methods 'SLSQP' and 'COBYLA', on the other hand, require constraints to be defined as a sequence of dictionaries with keys type, fun and jac.
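The snippets below follow the constrained Rosenbrock example from the SciPy tutorial; so that they run as written, the following setup is assumed (variable bounds, one linear and one nonlinear constraint, and SciPy's built-in Rosenbrock helpers):
import numpy as np
from scipy.optimize import (minimize, Bounds, LinearConstraint,
                            NonlinearConstraint, rosen, rosen_der, rosen_hess)
# Variable bounds: 0 <= x0 <= 1 and -0.5 <= x1 <= 2
bounds = Bounds([0, -0.5], [1.0, 2.0])
# Linear constraints: x0 + 2*x1 <= 1 and 2*x0 + x1 = 1
linear_constraint = LinearConstraint([[1, 2], [2, 1]], [-np.inf, 1], [1, 1])
# Nonlinear constraints: x0**2 + x1 <= 1 and x0**2 - x1 <= 1
def cons_f(x):
    return [x[0]**2 + x[1], x[0]**2 - x[1]]
def cons_J(x):
    return [[2*x[0], 1], [2*x[0], -1]]
# Without an explicit Hessian, a BFGS approximation is used by default
nonlinear_constraint = NonlinearConstraint(cons_f, -np.inf, 1, jac=cons_J)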
Solving the Optimization Problem:
The optimization problem is solved with:
x0 = np.array([0.5, 0])
res = minimize(rosen, x0, method='trust-constr', jac=rosen_der, hess=rosen_hess,
               constraints=[linear_constraint, nonlinear_constraint],
               options={'verbose': 1}, bounds=bounds)
# output may vary:
# `gtol` termination condition is satisfied.
# Number of iterations: 12, function evaluations: 8, CG iterations: 7, optimality: 2.99e-09, constraint violation: 1.11e-16, execution time: 0.016 s.
print(res.x)
Sequential Least SQuares Programming (SLSQP) Algorithm (method=’SLSQP’)
Both linear and nonlinear constraints are defined as dictionaries with keys type, fun and jac:
ineq_cons = {'type': 'ineq',
             'fun' : lambda x: np.array([1 - x[0] - 2*x[1],
                                         1 - x[0]**2 - x[1],
                                         1 - x[0]**2 + x[1]]),
             'jac' : lambda x: np.array([[-1.0, -2.0],
                                         [-2*x[0], -1.0],
                                         [-2*x[0], 1.0]])}
eq_cons = {'type': 'eq',
           'fun' : lambda x: np.array([2*x[0] + x[1] - 1]),
           'jac' : lambda x: np.array([2.0, 1.0])}
And the optimization problem is solved with:
x0 = np.array([0.5, 0])
res = minimize(rosen, x0, method='SLSQP', jac=rosen_der,
               constraints=[eq_cons, ineq_cons], options={'ftol': 1e-9, 'disp': True},
               bounds=bounds)
# output may vary:
# Optimization terminated successfully. (Exit mode 0)
#             Current function value: 0.342717574857755
#             Iterations: 5
#             Function evaluations: 6
#             Gradient evaluations: 5
print(res.x)
Most of the options available for the method ‘trust-constr’ are not available for ‘SLSQP’.
2) Solution Thoughts
[Figure: use case diagram for single-task-opt]
[Figure: module diagram for single-task-opt]
a) Detailed design for single-task-opt
The response analysis algorithm mainly performs parameter optimization and performance estimation. From the overall business perspective, optimizing a single business requires single-target response analysis.
The main function is divided into six modules. The main module uses SciPy's minimize function to optimize the response; the other five modules prepare the arguments that minimize needs. Given an objective function, the bounds of the variables, their initial values, and the minimization method to use (for example sequential least squares), minimize finds the minimum of the objective. The optimized response value and the corresponding variable values are then read off from the attributes of the result that minimize returns (see the sketches below).
Using the statsmodels.formula.api package in Python, the ols function fits a regression to the selected dataset, giving the coefficient of each variable. The coefficients and variables are then stitched together into a mathematical function that can be evaluated. Since response optimization falls into three categories, maximizing, minimizing, and target-value optimization, the objective function needs a corresponding mathematical transformation. That completes the preparation of the first argument of minimize, the objective function. The numpy and pandas packages are then used to process the variables' data to obtain their bounds and initial values.
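A minimal sketch of this step, using a synthetic dataset with hypothetical column names (y as the response, x1 and x2 as the variables):
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
# Toy dataset standing in for the real one
rng = np.random.default_rng(0)
df = pd.DataFrame({'x1': rng.uniform(0, 10, 50), 'x2': rng.uniform(0, 5, 50)})
df['y'] = 3.0 + 1.5 * df['x1'] - 2.0 * df['x2'] + rng.normal(0, 0.1, 50)
# Fit an OLS regression of the response on the variables
model = smf.ols('y ~ x1 + x2', data=df).fit()
coef = model.params  # Series indexed by 'Intercept', 'x1', 'x2'
# Stitch the coefficients and variables into a callable function
def f(x):
    return coef['Intercept'] + coef['x1'] * x[0] + coef['x2'] * x[1]
# Bounds and initial values derived from the observed data
X = df[['x1', 'x2']]
bounds = list(zip(X.min(), X.max()))  # [(lo1, hi1), (lo2, hi2)]
x0 = X.mean().to_numpy()              # start from the column means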
When the minimum of the objective function is wanted, the stitched-together function is passed to minimize directly. When the maximum is wanted, since Python only provides a minimize function, the stitched function is negated: the maximum of f(x) is obtained by finding the minimum of -f(x). For response optimization toward a target value, the problem is converted to minimizing the absolute-value function |target value - f(x)|, which yields a result in which f(x) is as close as possible to the target value.
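Putting the three cases together, a minimal sketch continuing the f, bounds, and x0 defined above (make_objective is my own helper name, not part of any library):
from scipy.optimize import minimize
def make_objective(f, goal, target=None):
    # Wrap the fitted function for the three response-optimization goals
    if goal == 'minimize':
        return f
    if goal == 'maximize':
        return lambda x: -f(x)               # max f(x) == min -f(x)
    if goal == 'target':
        return lambda x: abs(target - f(x))  # drive f(x) toward the target
    raise ValueError('unknown goal: %s' % goal)
obj = make_objective(f, 'target', target=5.0)
res = minimize(obj, x0, method='SLSQP', bounds=bounds)
print(res.x)     # optimized variable values
print(f(res.x))  # optimized response, near 5.0 if the target is reachable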
Thinking
In many cases, multiple services each need to reach an optimal range at the same time, so multi-task response analysis is required.
I solve the problem by constructing a new objective function, but other approaches may perform better.
Sometimes the objective value itself is subject to constraints, so the algorithm needs some further improvement (see the sketch below).
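One way to handle such a constraint, sketched under the assumption that the response f(x) must stay at or above some floor value (the names here are mine, continuing the earlier sketches):
# Require the response to satisfy f(x) >= floor while optimizing
floor = 2.0
response_floor = {'type': 'ineq',
                  'fun': lambda x: f(x) - floor}  # 'ineq' means fun(x) >= 0
res = minimize(obj, x0, method='SLSQP', bounds=bounds,
               constraints=[response_floor])
print(res.x)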