nonlinear optimization functions

C++ symbolic computation library

Moderator: xcasadmin

parisse
Posts: 5734
Joined: Tue Dec 20, 2005 4:02 pm
Contact:

Re: nonlinear optimization functions

Post by parisse » Mon Jan 22, 2018 12:14 pm

LaTeX source now updated, thanks!

lukamar
Posts: 331
Joined: Fri Jun 30, 2017 9:55 am
Location: Zagreb, Croatia

Re: nonlinear optimization functions

Post by lukamar » Mon Jan 22, 2018 3:47 pm

Thank you very much. I also forgot to mention that there are new keywords needed for nlpsolve: nlp_initialpoint, nlp_iterationlimit, nlp_nonnegative, nlp_precision and nlp_maximize (defined in optimization.h). lp_maximize and nlp_maximize are actually no longer needed, as the keyword "maximize" is available, but I suggest keeping them for convenience.
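
For illustration, here is a hedged sketch of how these keywords might appear in a call (the exact nlpsolve signature is not shown in this thread; I am assuming a Maple-like form, and the objective and constraint below are made up):

Code: Select all

// hypothetical call: minimize x^2+y^2 subject to x+y>=4 with x,y>=0,
// starting the iterative solver from the point (1,1)
nlpsolve(x^2+y^2, [x+y>=4], nlp_initialpoint=[x=1,y=1], nlp_nonnegative)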

lukamar
Posts: 331
Joined: Fri Jun 30, 2017 9:55 am
Location: Zagreb, Croatia

Re: nonlinear optimization functions

Post by lukamar » Tue Jan 30, 2018 2:08 am

I corrected a bug in optimization.cc and also rewrote several small portions of the code. The bug was in the univariate case of minimize/maximize with the function f(x)=exp(x-x^2) for x=-1..1. It turns out that the construction below fails to give the result:

Code: Select all

// set up a fresh context and a symbolic variable x
context ct;
gen x=identificateur("x");
// restrict x to the open interval (-1,1)
assume_t_in_ab(x,-1,1,false,false,&ct);
// differentiate f(x)=exp(x-x^2) and attempt to solve f'(x)=0
gen df=_derive(makesequence(giac::exp(x-pow(x,2),&ct),x),&ct);
cout << _solve(makesequence(df,x),&ct) << endl;
But if I replace "solve" with "zeros", I get the correct solution x=1/2. Hence I have simply replaced solve with zeros in the univariate case for now. But why does "solve" fail on such a simple equation? It must have something to do with assumptions, since "solve" works as expected when the assume_t_in_ab line is removed.
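
For reference, the same workaround expressed at the Xcas level (a sketch; the expected output follows from the discussion above):

Code: Select all

assume(x>=-1 and x<=1);
// zeros finds the critical point that solve missed in the C++ reproduction
zeros(diff(exp(x-x^2),x),x)   // expected: [1/2]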
Attachments
optimization-fix11.zip
(24.36 KiB) Downloaded 162 times

parisse
Posts: 5734
Joined: Tue Dec 20, 2005 4:02 pm
Contact:

Re: nonlinear optimization functions

Post by parisse » Tue Jan 30, 2018 7:49 am

Thanks for reporting; there was indeed something to fix in solve.cc. I have updated my source with your changes.

lukamar
Posts: 331
Joined: Fri Jun 30, 2017 9:55 am
Location: Zagreb, Croatia

Re: nonlinear optimization functions

Post by lukamar » Wed Jan 31, 2018 7:29 am

Thank you for updating. Note that the bug causing the error in the minimal example above does not show up when using Xcas: the commands

Code: Select all

assume(x>=-1 and x<=1); solve(diff(exp(x-x^2),x)=0,x)
give the expected result.

lukamar
Posts: 331
Joined: Fri Jun 30, 2017 9:55 am
Location: Zagreb, Croatia

Re: nonlinear optimization functions

Post by lukamar » Thu Feb 08, 2018 9:09 am

In addition to thiele, I have added one more interpolation command, called triginterp, which returns a trigonometric polynomial interpolating the given data. See the enclosed tex file for the details of usage. The function itself is very simple, as it assumes that the data points are equally spaced with respect to the abscissa. It is inspired by the fft exercise in cascmd_fr. My implementation does not use fft, but rather a Dirichlet kernel, in order to return exact coefficients for exact data. Now, with the lagrange (spline), thiele and triginterp commands, Giac can do polynomial (spline), rational and trigonometric interpolation out of the box.
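
For reference, with $N$ equally spaced samples spanning one period $T$, the returned trigonometric polynomial presumably takes the standard form (my assumption from the description above; it is consistent with the coefficient code quoted later in this thread):

$$ p(x)=\frac{a_0}{2}+\sum_{k=1}^{m}\left(a_k\cos\frac{2\pi kx}{T}+b_k\sin\frac{2\pi kx}{T}\right), $$

where the degree $m$ is roughly $N/2$ and the coefficients $a_k,b_k$ are obtained exactly from the data via Dirichlet-kernel sums.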
I'm not quite sure, however, which function to choose for simplifying the coefficients. For the time being the code uses _simplify (which is probably not recommended, since it can get stuck for a while, although this is a very simple case), and it may return rootofs. Is it better to stick with ratnormal instead? The simplification is done in lines 2419 and 2420.
Attachments
optimization-fix12.zip
(26.74 KiB) Downloaded 162 times

parisse
Posts: 5734
Joined: Tue Dec 20, 2005 4:02 pm
Contact:

Re: nonlinear optimization functions

Post by parisse » Thu Feb 08, 2018 7:29 pm

I will insert it with an easy switch:

Code: Select all

#if 1
        tp+=_simplify(c*ak,contextptr)*cos(_ratnormal((j+1)*twopi/T,contextptr)*x,contextptr);
        tp+=_simplify(c*bk,contextptr)*sin(_ratnormal((j+1)*twopi/T,contextptr)*x,contextptr);
#else
        tp+=_ratnormal(c*ak,contextptr)*cos(_ratnormal((j+1)*twopi/T,contextptr)*x,contextptr);
        tp+=_ratnormal(c*bk,contextptr)*sin(_ratnormal((j+1)*twopi/T,contextptr)*x,contextptr);
#endif
We'll see; if there are complaints about rootofs or endless simplifications, then we will switch to ratnormal...

lukamar
Posts: 331
Joined: Fri Jun 30, 2017 9:55 am
Location: Zagreb, Croatia

Re: nonlinear optimization functions

Post by lukamar » Mon Jul 16, 2018 2:58 pm

Hi Bernard,
I've made some fixes to optimization.cc/.h; the new version is available here: https://github.com/marohnicluka/optimization. Changes:
- implicit differentiation routines are now organized within a class called ipdiff, and the code is slightly more efficient
- fixed a bug in the extrema command which caused some critical points to remain undiscovered
- added a new option 'lagrange' to the extrema command, which forces the use of Lagrange multipliers (see the sketch below)
- the MODI method is now available via the tprob class for balanced transportation problems
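
As a quick illustration of the new option (my own example, not taken from the repository; the output shape is the minima/maxima sequence described in the manual entry below):

Code: Select all

// force Lagrange multipliers when optimizing x+y on the unit circle;
// expected extrema at (-sqrt(2)/2,-sqrt(2)/2) (min) and (sqrt(2)/2,sqrt(2)/2) (max)
extrema(x+y, x^2+y^2=1, [x,y], lagrange)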

Also, I've written a manuscript covering the theoretical background for the algorithm used by the extrema command; it also explains how implicit differentiation is implemented. It is located in the doc directory.

Edit: here is the modified entry for cascmd_en.tex.

Code: Select all

\subsection{Local extrema: {\tt extrema}}\index{extrema|textbf}

Local extrema of a univariate or multivariate differentiable function under equality constraints can be obtained by using the function {\tt extrema}, which accepts the following arguments:
\begin{itemize}
\item {\tt expr}: differentiable expression
\item {\tt constr} (optional): list of equality constraints
\item {\tt vars}: list of variables
\item {\tt order\_size=<positive integer>} or {\tt lagrange} (optional): upper bound for the order of the derivatives examined in the process (defaults to 5), or the specifier for the method of Lagrange multipliers
\end{itemize}
The function returns a sequence of two lists of points: local minima and maxima, respectively. Saddle and unclassified points are reported in the message area, and information about possible (non-)strict extrema is also printed out. If {\tt lagrange} is passed as the optional last argument, the method of Lagrange multipliers is used. Otherwise, the problem is reduced to an unconstrained one by applying implicit differentiation.

A single constraint or variable can be specified without list delimiters. A constraint may be given either as an equality or as an expression, which is assumed to be equal to zero.

The number of constraints must be strictly less than the number of variables. Additionally, denoting the $ k $-th constraint by $ g_k(x_1,x_2,\dots,x_n)=0 $ for $ k=1,2,\dots,m $ and letting $ \mathbf{g}=(g_1,g_2,\dots,g_m) $, the Jacobian matrix of $ \mathbf{g} $ has to have full rank (i.e.~rank equal to $ m $).

Variables may be specified with bounds, e.g.~{\tt x=a..b}, which is interpreted as $ x\in(a,b) $. For semi-bounded variables one can use {\tt -infinity} for $ a $ or {\tt +infinity} for $ b $. The parameter {\tt vars} may also be entered as, e.g., {\tt [x1=a1,x2=a2,...,xn=an]}, in which case the critical point close to $ \mathbf{a}=(a_1,a_2,\dots,a_n) $ is computed numerically by applying an iterative method with initial point $ \mathbf{a} $.

If {\tt order\_size=<n>} is specified as the fourth argument, derivatives up to order $ n $ are inspected in order to find and classify the critical points. For {\tt order\_size=1} the function returns a single list containing all critical points found. The default is $ n=5 $. If some critical points are left unclassified, one may repeat the process with a larger value of $ n $, although success is not guaranteed.
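
A minimal usage sketch to accompany the entry (my own example, not from the attached documentation; the expected output shape follows the minima/maxima description above):

Code: Select all

// minimize x^2+y^2 subject to x+y=1;
// expected result: [[1/2,1/2]],[]   (one local minimum, no maxima)
extrema(x^2+y^2, x+y=1, [x,y])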

parisse
Posts: 5734
Joined: Tue Dec 20, 2005 4:02 pm
Contact:

Re: nonlinear optimization functions

Post by parisse » Tue Jul 17, 2018 8:52 am

Thank you, I have updated the source!

parisse
Posts: 5734
Joined: Tue Dec 20, 2005 4:02 pm
Contact:

Re: nonlinear optimization functions

Post by parisse » Thu Jul 19, 2018 8:31 am

I had to add in optimization.cc:

Code: Select all

typedef unsigned long ulong;
to compile with emscripten.

lukamar
Posts: 331
Joined: Fri Jun 30, 2017 9:55 am
Location: Zagreb, Croatia

Re: nonlinear optimization functions

Post by lukamar » Thu Jul 19, 2018 4:19 pm

Indeed, the ulong type appeared only in the latest version. I have updated my source.

lukamar
Posts: 331
Joined: Fri Jun 30, 2017 9:55 am
Location: Zagreb, Croatia

Re: nonlinear optimization functions

Post by lukamar » Wed Oct 10, 2018 2:53 pm

Hello Bernard,
I made an update to optimization.cc/.h: there is a new command kernel_density (alias kde) which performs kernel density estimation (a similar command, KernelDensity, exists in Maple). As a reference for the theoretical background and implementation tricks I used Artur Gramacki's text "Nonparametric Kernel Density Estimation and Its Computational Aspects". The algorithm can apply FFT to the computation of the kernel sums, hence it is quite fast. It also includes a direct plug-in bandwidth selector which gives excellent results.
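For context, the quantity such a command estimates is the standard kernel density estimate (the usual textbook definition, not something specific to this implementation): given samples $x_1,\dots,x_n$, a kernel $K$ and a bandwidth $h$,

$$ \hat f(x)=\frac{1}{nh}\sum_{i=1}^{n}K\left(\frac{x-x_i}{h}\right). $$

The FFT applies because, after binning the samples, this sum becomes a discrete convolution of the bin counts with sampled kernel values.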
There are two new options that I had to add: bandwidth and bins (I have enclosed the updated files dispatch.h and lexer_tab_int.h).
The documentation is in kde.tex (it can be appended to the "Statistics/Density and distribution functions" section of cascmd_en.tex). Here is the short help I propose:

Code: Select all

# kernel_density kde
0 Lst(L),[options]
2 Estimates the probability density function from which the (presumably independent) samples in list L are drawn.
-1 random_variable
-2 sample
kernel_density([1,2,3,2],bandwidth=1/4,exact)
 X:=randvar(gammad,mean=5,stddev=2):; kernel_density(sample(X,500),bins=50)
 Y:=randvar(normal,mean=5,stddev=1.5):; plot(kernel_density(sample(Y,1000),bins=50,piecewise),x=0..10)

# bandwidth
0 Opt
2 Option for the kernel_density command.
-1 kernel_density
-2 bins

# bins
0 Opt
2 Option for the kernel_density command.
-1 kernel_density
-2 bandwidth
Luka
Attachments
files.zip
(34.44 KiB) Downloaded 115 times

lukamar
Posts: 331
Joined: Fri Jun 30, 2017 9:55 am
Location: Zagreb, Croatia

Re: nonlinear optimization functions

Post by lukamar » Wed Oct 10, 2018 10:24 pm

I had to correct one of the examples in kde.tex; here's the updated file.
Attachments
kde-doc.zip
(2.07 KiB) Downloaded 108 times

parisse
Posts: 5734
Joined: Tue Dec 20, 2005 4:02 pm
Contact:

Re: nonlinear optimization functions

Post by parisse » Thu Oct 11, 2018 11:49 am

You forgot optimization.cc/.h :-)
By the way, it would be easier for me to update (and perhaps for you too) if you could put these files in your graph-theory repository (I know it's a little bit dirty, but I'm not a computer scientist :-) ).

lukamar
Posts: 331
Joined: Fri Jun 30, 2017 9:55 am
Location: Zagreb, Croatia

Re: nonlinear optimization functions

Post by lukamar » Thu Oct 11, 2018 12:41 pm

I keep optimization.cc/.h in a separate repository here:
https://github.com/marohnicluka/optimization
Sorry, I forgot to mention it.
I can move the files to graphtheory if that's easier for you.
