Page **2** of **6**

### Re: nonlinear optimization functions

Posted: **Tue Sep 05, 2017 6:59 am**

by **lukamar**

Yes, it probably will... I have simplified the test. In most cases there is at most one singularity between two consecutive x-points (say x1 and x2); I've never seen a counterexample, though I'm not sure one doesn't exist. So I weakened the test a bit: it now checks only the sign of the interpolant's denominator at x1 and x2. If den(x1)*den(x2)<0, there is certainly a singularity in (x1,x2). I'll try to find or provide a proof of its uniqueness.

### Re: nonlinear optimization functions

Posted: **Tue Sep 05, 2017 2:11 pm**

by **parisse**

Two warnings issued by clang:

Code:

```
optimization.cc:79:17: warning: using directive refers to implicitly-defined namespace 'std'
using namespace std;
^
optimization.cc:914:71: warning: '&&' within '||' [-Wlogical-op-parentheses]
...(k%2 || is_strictly_positive(-pmin,contextptr) && is_strictly_positive(pmax,contextptr)) {
~~ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
optimization.cc:914:71: note: place parentheses around the '&&' expression to silence this
warning
...|| is_strictly_positive(-pmin,contextptr) && is_strictly_positive(pmax,contextptr)) {
```

For the first one, I just moved "using namespace std;" after "#include <sstream>"; for the second, I added parentheses:

Code:

```
(k%2 || (is_strictly_positive(-pmin,contextptr) && is_strictly_positive(pmax,contextptr)))
```

### Re: nonlinear optimization functions

Posted: **Tue Sep 05, 2017 4:55 pm**

by **lukamar**

Thank you, I have mirrored this in my version of the code. g++ wasn't complaining, so I missed it...

### Re: nonlinear optimization functions

Posted: **Mon Sep 11, 2017 12:51 pm**

by **parisse**

I have uploaded unstable binaries with your commands incorporated. The giac version was upgraded to 1.4.9. Let me know if I missed something!

### Re: nonlinear optimization functions

Posted: **Wed Sep 13, 2017 9:05 am**

by **lukamar**

Thanks for the upgrade! Everything works fine, except that 'extrema' and 'periodic' are not highlighted like the other commands. Could the option keywords starting with 'lp_' for 'lpsolve' be highlighted as well?

I've worked on the code some more and have rewritten some parts used by minimize, maximize and extrema to make the computation more mathematically coherent.

1. minimize and maximize now use Karush-Kuhn-Tucker conditions to find critical points in multivariate cases. The univariate case is treated separately.

2. I've managed to implement implicit differentiation for multivariate functions (the general case). Therefore, candidates for constrained local extrema can be found without Lagrange multipliers and tested with the ordinary Hessian rather than the bordered one.

3. Since implicit differentiation is generally useful, I've made it accessible via the new function **implicitdiff** (see line 986 in optimization.cc), which is compatible with the corresponding Maple command. It is even more versatile when it comes to input syntax, as it allows differentiating an expression and computing all partial derivatives of a given order at the same time (with no significant computational cost compared to returning a single derivative), returning them as a gradient (order=1), a Hessian (order=2) or a table (order>2). However, the computation gets tremendously complex for higher derivatives (say, order 10) of a function of three or more variables, so this function can't magically return any derivative one may wish for. There is not much to be done about that, as the number of terms in the symbolic equations that must be solved during the process grows exponentially, and a significant amount of time is needed to combine them and thereby reduce their number (see line 672 in optimization.cc). The function works fine for all examples given in the Maple help pages (https://de.maplesoft.com/support/help/m ... plicitdiff). I've enclosed the corresponding documentation entry.

4. I've made a simple preprocessor for 'solve' called 'solve2' (see line 129 in optimization.cc), which allows solving systems of equations that are rational with respect to exp(x), cosh(x), sinh(x) or sin(x), cos(x), tan(x). It works by a simple substitution. This enables the critical points of e.g. f(x,y)=x+y-exp(x)-exp(y)-exp(x+y), f(x,y)=x^2*sin(y)-4*x or f(x,y)=(1+y*sinh(x))/(1+y^2+tanh(x)^2) to be found analytically.

5. I've compiled optimization.cc and lpsolve.cc using gcc with the '-pedantic' option to clean up the code and get rid of unnecessary warnings. That revealed a small bug in 'lpsolve' which blocked assigning the depth-limit parameter for the branch-and-bound tree. It is now fixed.
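As a reminder of the mathematics behind points 2 and 3 (this is the textbook first-order picture from the implicit function theorem, not necessarily the exact formulas used in optimization.cc): for a single constraint in two variables,

```latex
% For a constraint g(x,y) = 0 with g_y \neq 0, the implicit function
% theorem gives
\frac{dy}{dx} = -\frac{g_x}{g_y},
% so the derivative of an objective f along the constraint set is
\frac{d}{dx}\, f\bigl(x, y(x)\bigr) = f_x - f_y \, \frac{g_x}{g_y}.
% Constrained critical points are the zeros of this reduced gradient, and
% they can be classified with the ordinary Hessian of the reduced function
% x \mapsto f(x, y(x)), with no Lagrange multipliers or bordered Hessian
% required.
```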

There is a possibility of making the classification of critical points in 'extrema' even more rigorous, but it requires some theoretical clarification which I'm working on now.

All examples work with the new version of code attached below.

### Re: nonlinear optimization functions

Posted: **Wed Sep 13, 2017 9:41 am**

by **lukamar**

I forgot to mention that the documentation for the optimization functions has also been refined! There are some minor changes and improvements. The new version is enclosed within optimization-fix8.zip.

### Re: nonlinear optimization functions

Posted: **Wed Sep 13, 2017 12:27 pm**

by **parisse**

Thanks! It's now incorporated in the source, and in the Firefox version (without syntax highlighting).

Your implicitdiff code is (much) more complete than mine (the implicit_diff command, which I wrote at the request of PocketCAS and HP). I'll try to improve (exact multivariate) solve when I have some time (ideally it should prepare its input like univariate solve does, also handling sqrt for example).

### Re: nonlinear optimization functions

Posted: **Wed Sep 13, 2017 1:54 pm**

by **lukamar**

I optimized the process of implicit differentiation somewhat. I changed the definition of the impldiff type; it is now a map. While computing terms in derive_diffterms, they are grouped automatically, so sorting is no longer needed. The implicitdiff command is now noticeably faster, producing tenth-order partial derivatives of x*y*z under the constraint -2x^3+15x^2*y+11y^3-24y=0 in just a couple of seconds. The complexity still rises fast, though...

### Re: nonlinear optimization functions

Posted: **Wed Sep 13, 2017 6:30 pm**

by **parisse**

What is the new definition of impldiff? (You forgot optimization.h in the zip file.)

### Re: nonlinear optimization functions

Posted: **Wed Sep 13, 2017 6:32 pm**

by **parisse**

### Re: nonlinear optimization functions

Posted: **Thu Sep 14, 2017 1:01 am**

by **lukamar**

Yes, that's it. Sorry!

### Re: nonlinear optimization functions

Posted: **Mon Sep 25, 2017 9:46 pm**

by **lukamar**

Hello,

I noticed a small error in the documentation for implicitdiff. It concerns the last example, in which pd is defined. The last command of that example should be

and not pd[4,0,0]. I've attached the corrected file below. The correction should also be applied to the last example in the tooltip help in Xcas.

### Re: nonlinear optimization functions

Posted: **Mon Jan 22, 2018 7:19 am**

by **lukamar**

Below is an updated version of the code with a new function, "nlpsolve", added to the collection. It is a replacement for Maple's NLPSolve and uses the implementation of the Cobyla algorithm already available in Giac. It solves nonlinear programming problems with equality and inequality constraints. Also, cout was replaced with *logptr(contextptr) throughout optimization.cc to make messages generated by the optimization functions visible in Xcas.

As Cobyla requires an initial feasible solution, if one is not given it is generated automatically by running the algorithm with a constant objective function under the given constraints, starting from the point (1,1,...) or from the given infeasible solution. This approach seems to work well, but it may sometimes fail. A nice testing suite of 100+ small NLP problems is available here: http://www.ai7.uni-bayreuth.de/test_problem_coll.pdf. I've tried problems 5, 7, 19, 34, 36 and 45. Sometimes a modification of the initial guess is needed (Cobyla does not accept zero as an initial value for a variable).

The documentation entry for nlpsolve is included in the file below. As Cobyla does not use derivatives but instead approximates the nonlinear problem with successive LP problems, I suggest inserting it immediately after lpsolve in cascmd_en.tex.

### Re: nonlinear optimization functions

Posted: **Mon Jan 22, 2018 9:58 am**

by **parisse**

Thanks!

I have successfully updated optimization.cc/.h, but I could not update the documentation because you forgot to include it in the zip file.

### Re: nonlinear optimization functions

Posted: **Mon Jan 22, 2018 10:15 am**

by **lukamar**

I forgot indeed, here it is now.