@@ -19,39 +19,28 @@ Let's optimize the classical _Rosenbrock function_ in two dimensions.
```@example optimization
using ModelingToolkit, Optimization, OptimizationOptimJL
@variables begin
- x, [bounds = (-2.0, 2.0), guess = 1.0]
- y, [bounds = (-1.0, 3.0), guess = 3.0]
+ x = 1.0, [bounds = (-2.0, 2.0)]
+ y = 3.0, [bounds = (-1.0, 3.0)]
end
- @parameters a=1 b=1
+ @parameters a=1.0 b=1.0
rosenbrock = (a - x)^2 + b * (y - x^2)^2
@mtkbuild sys = OptimizationSystem(rosenbrock, [x, y], [a, b])
```

Every optimization problem consists of a set of optimization variables.
- In this case, we create two variables: `x` and `y`.
+ In this case, we create two variables: `x` and `y`,
+ with initial guesses `1` and `3` for their optimal values.
- Additionally, we assign box constraints for each of them, using `bounds`,
- as well as an initial guess for their optimal values, using `guess`.
- Both bounds and guess are called symbolic metadata.
+ Additionally, we assign box constraints for each of them, using `bounds`.
+ `bounds` is an example of symbolic metadata.
For more information, take a look at the symbolic metadata
- [documentation page](symbolic_metadata).
+ [documentation page](@ref symbolic_metadata).

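+ For instance, metadata attached to a variable can be queried back from it. A minimal
+ sketch, assuming `ModelingToolkit.getbounds` as the accessor for the `bounds` metadata:
+
+ ```@example optimization
+ # Read the box constraints stored as symbolic metadata on x (assumed accessor)
+ ModelingToolkit.getbounds(x)
+ ```
+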
We also create two parameters with `@parameters`.
Parameters are useful if you want to solve the same optimization problem multiple times,
with different values for these parameters.
Default values for these parameters can also be assigned; here `1` is used for both `a` and `b`.
These optimization values and parameters are used in an objective function, here the Rosenbrock function.

- A visualization of the Rosenbrock function is depicted below.
-
- ```@example optimization
- using Plots
- x_plot = -2:0.01:2
- y_plot = -1:0.01:3
- contour(
-     x_plot, y_plot, (x, y) -> (1 - x)^2 + 100 * (y - x^2)^2, fill = true, color = :viridis,
-     ratio = :equal, xlims = (-2, 2))
- ```
-
Next, the actual `OptimizationProblem` can be created.
The initial guesses for the optimization variables can be overwritten, via an array of `Pairs`,
in the second argument of `OptimizationProblem`.
@@ -64,10 +53,20 @@ u0 = [y => 2.0]
p = [b => 100.0]

prob = OptimizationProblem(sys, u0, p, grad = true, hess = true)
- solve(prob, GradientDescent())
+ u_opt = solve(prob, GradientDescent())
```
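+
+ Here, `grad = true` and `hess = true` request symbolically generated gradient and Hessian
+ functions for the objective, which gradient-based optimizers such as `GradientDescent` can use.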

- We see that the optimization result corresponds to the minimum in the figure.
+ A visualization of the Rosenbrock function is depicted below.
+
+ ```@example optimization
+ using Plots
+ x_plot = -2:0.01:2
+ y_plot = -1:0.01:3
+ contour(
+     x_plot, y_plot, (x, y) -> (1 - x)^2 + 100 * (y - x^2)^2, fill = true, color = :viridis,
+     ratio = :equal, xlims = (-2, 2))
+ scatter!([u_opt[1]], [u_opt[2]], ms = 10, label = "minimum")
+ ```
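+
+ Because `b` is a parameter, the same compiled problem can be solved again with a
+ different parameter value. A minimal sketch, assuming `remake` (the standard SciMLBase
+ utility for rebuilding problems) accepts a symbolic parameter map here:
+
+ ```@example optimization
+ # Rebuild the problem with b = 10 instead of 100, reusing everything else (sketch)
+ prob2 = remake(prob, p = [b => 10.0])
+ solve(prob2, GradientDescent())
+ ```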
## Rosenbrock Function with Constraints
@@ -79,10 +78,10 @@ Let's add an inequality constraint to the previous example:
using ModelingToolkit, Optimization, OptimizationOptimJL
@variables begin
- x, [bounds = (-2.0, 2.0), guess = 1.0]
- y, [bounds = (-1.0, 3.0), guess = 2.0]
+ x = 0.14, [bounds = (-2.0, 2.0)]
+ y = 0.14, [bounds = (-1.0, 3.0)]
end
- @parameters a=1 b=100
+ @parameters a=1.0 b=100.0
rosenbrock = (a - x)^2 + b * (y - x^2)^2
cons = [
x^2 + y^2 ≲ 1
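
A constrained system can then be built and solved. A minimal sketch, assuming the
`constraints` keyword of `OptimizationSystem`, the `cons_j`/`cons_h` keywords of
`OptimizationProblem`, and the interior-point solver `IPNewton` from OptimizationOptimJL:

```@example optimization
# Self-contained sketch of the constrained setup (keyword names are assumptions)
cons = [x^2 + y^2 ≲ 1]
@mtkbuild sys = OptimizationSystem(rosenbrock, [x, y], [a, b], constraints = cons)

u0 = [x => 0.14, y => 0.14]
# cons_j / cons_h request constraint Jacobian and Hessian functions
prob = OptimizationProblem(sys, u0, grad = true, hess = true, cons_j = true, cons_h = true)
solve(prob, IPNewton())  # IPNewton supports the inequality constraint
```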