
Commit bd4783c

remove guess metadata, use default value instead
1 parent 7cd30a9 commit bd4783c

File tree

1 file changed: +22 -23 lines


docs/src/tutorials/optimization.md

Lines changed: 22 additions & 23 deletions
@@ -19,39 +19,28 @@ Let's optimize the classical _Rosenbrock function_ in two dimensions.
 ```@example optimization
 using ModelingToolkit, Optimization, OptimizationOptimJL
 @variables begin
-    x, [bounds = (-2.0, 2.0), guess = 1.0]
-    y, [bounds = (-1.0, 3.0), guess = 3.0]
+    x = 1.0, [bounds = (-2.0, 2.0)]
+    y = 3.0, [bounds = (-1.0, 3.0)]
 end
-@parameters a=1 b=1
+@parameters a=1.0 b=1.0
 rosenbrock = (a - x)^2 + b * (y - x^2)^2
 @mtkbuild sys = OptimizationSystem(rosenbrock, [x, y], [a, b])
 ```
 
 Every optimization problem consists of a set of optimization variables.
-In this case, we create two variables: `x` and `y`.
+In this case, we create two variables: `x` and `y`,
+with initial guesses `1` and `3` for their optimal values.
 Additionally, we assign box constraints for each of them, using `bounds`,
-as well as an initial guess for their optimal values, using `guess`.
-Both bounds and guess are called symbolic metadata.
+Bounds is an example of symbolic metadata.
 For more information, take a look at the symbolic metadata
-[documentation page](symbolic_metadata).
+[documentation page](@ref symbolic_metadata).
 
 We also create two parameters with `@parameters`.
 Parameters are useful if you want to solve the same optimization problem multiple times,
 with different values for these parameters.
 Default values for these parameters can also be assigned, here `1` is used for both `a` and `b`.
 These optimization values and parameters are used in an objective function, here the Rosenbrock function.
 
-A visualization of the Rosenbrock function is depicted below.
-
-```@example optimization
-using Plots
-x_plot = -2:0.01:2
-y_plot = -1:0.01:3
-contour(
-    x_plot, y_plot, (x, y) -> (1 - x)^2 + 100 * (y - x^2)^2, fill = true, color = :viridis,
-    ratio = :equal, xlims = (-2, 2))
-```
-
 Next, the actual `OptimizationProblem` can be created.
 The initial guesses for the optimization variables can be overwritten, via an array of `Pairs`,
 in the second argument of `OptimizationProblem`.
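Editor's aside, not part of the commit: the new default values introduced in this hunk can be sanity-checked by evaluating the objective directly. A minimal plain-Julia sketch, assuming only the Rosenbrock formula from the tutorial (no packages):

```julia
# Rosenbrock objective from the tutorial, with the default parameters a = b = 1.
rosenbrock(x, y; a = 1.0, b = 1.0) = (a - x)^2 + b * (y - x^2)^2

rosenbrock(1.0, 3.0)  # objective at the new default initial point (x = 1, y = 3): 4.0
rosenbrock(1.0, 1.0)  # objective at the analytic minimum (a, a^2) = (1, 1): 0.0
```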
@@ -64,10 +53,20 @@ u0 = [y => 2.0]
 p = [b => 100.0]
 
 prob = OptimizationProblem(sys, u0, p, grad = true, hess = true)
-solve(prob, GradientDescent())
+u_opt = solve(prob, GradientDescent())
 ```
 
-We see that the optimization result corresponds to the minimum in the figure.
+A visualization of the Rosenbrock function is depicted below.
+
+```@example optimization
+using Plots
+x_plot = -2:0.01:2
+y_plot = -1:0.01:3
+contour(
+    x_plot, y_plot, (x, y) -> (1 - x)^2 + 100 * (y - x^2)^2, fill = true, color = :viridis,
+    ratio = :equal, xlims = (-2, 2))
+scatter!([u_opt[1]], [u_opt[2]], ms = 10, label = "minimum")
+```
 
 ## Rosenbrock Function with Constraints

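Another aside, not in the diff: that `solve(prob, GradientDescent())` lands at `(1, 1)` can be confirmed analytically, since the Rosenbrock gradient vanishes only at `(a, a^2)`. A sketch in plain Julia, with the gradient derived by hand:

```julia
# Hand-derived gradient of (a - x)^2 + b * (y - x^2)^2.
function rosenbrock_grad(x, y; a = 1.0, b = 100.0)
    gx = -2 * (a - x) - 4 * b * x * (y - x^2)
    gy = 2 * b * (y - x^2)
    return (gx, gy)
end

rosenbrock_grad(1.0, 1.0)  # both components vanish at (a, a^2) = (1, 1)
rosenbrock_grad(1.0, 2.0)  # nonzero at the tutorial's starting point y => 2.0
```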
@@ -79,10 +78,10 @@ Let's add an inequality constraint to the previous example:
 using ModelingToolkit, Optimization, OptimizationOptimJL
 
 @variables begin
-    x, [bounds = (-2.0, 2.0), guess = 1.0]
-    y, [bounds = (-1.0, 3.0), guess = 2.0]
+    x = 0.14, [bounds = (-2.0, 2.0)]
+    y = 0.14, [bounds = (-1.0, 3.0)]
 end
-@parameters a=1 b=100
+@parameters a=1.0 b=100.0
 rosenbrock = (a - x)^2 + b * (y - x^2)^2
 cons = [
     x^2 + y^2 ≲ 1

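One more aside on the constrained variant (not part of the commit): the unconstrained minimum `(1, 1)` violates `x^2 + y^2 ≲ 1`, so the constraint is active at the constrained optimum, and the commit's new initial point `(0.14, 0.14)` starts strictly inside the feasible disk. A quick plain-Julia check:

```julia
# The unconstrained Rosenbrock minimum (a, a^2) = (1, 1) lies outside the
# unit disk, so the constraint x^2 + y^2 <= 1 is active at the optimum.
x_star, y_star = 1.0, 1.0
x_star^2 + y_star^2 > 1.0   # true: (1, 1) is infeasible

# The commit's new initial point is strictly feasible.
x0, y0 = 0.14, 0.14
x0^2 + y0^2 < 1.0           # true: 0.0392 < 1
```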