Current revision: 342 Date: 2017-02-07
Since 2001, the Mixed-Integer Nonlinear Programming Library (MINLPLib) and the Nonlinear Programming Library (GLOBALLib) have provided algorithm developers with a large and varied set of both theoretical and practical test models. We have recently started major updates to MINLPLib (now also incorporating NLPs), which include the addition of many more instances. We hope that the updated library can serve as a starting point for defining a widely accepted test set to evaluate the performance of MINLP solving software.
Note that the development of this update to MINLPLib is currently in the ALPHA phase. That is, while the library already provides more instances, more solution points, and more information on each instance than the first MINLPLib, instances may still be added, removed, or modified without notice.
The default variable bounds are documented in the GAMS User's Guide. However, MINLPLib 2 works around the GAMS default upper bound of 100 for integer variables by setting the option intVarUp = 0 (see, e.g., model jit1). Therefore, integer variables are assumed to have a default lower bound of 0 and a default upper bound of +infinity.
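As a minimal sketch of what this looks like in a model file (the model and variable names here are made up for illustration, not taken from MINLPLib), an integer variable declared without an explicit .up attribute would normally receive the GAMS default upper bound of 100; with intVarUp = 0 the solver instead sees an unbounded-above variable:

```gams
* Hypothetical example: an integer variable with no explicit upper bound.
Integer Variable x;
Variable z;

Equation obj;
obj.. z =e= sqr(x - 150);

Model m / all /;

* Without this option, x would implicitly get x.up = 100,
* cutting off the unconstrained minimizer x = 150.
* MINLPLib 2 instances assume intVarUp = 0, i.e. x.up = +inf.
option intVarUp = 0;

solve m using minlp minimizing z;
```

The practical consequence for solver developers is that bounds read from the instance files should be taken at face value: an integer variable without a stated upper bound really is unbounded above.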
No. New solutions are usually only added when they improve the primal bound. If several solvers find the same solution, it is arbitrary which solver is credited with it. Further, the reported dual bounds are simply the best values computed in some run, with some option settings, on some machine, at some time in the past.
We are looking for more interesting and challenging (MI)NLPs from all fields of Operations Research and Combinatorial Optimization, ideally models built to represent real-life problems.
If you would like to contribute, please send your instances by e-mail. We accept any well-known format that can be translated into GAMS. This includes AMPL (.mod and .nl), GAMS, ZIMPL, BARON, CPLEX LP, MPS, PIP, and OSiL.
Further, if you have a MINLP model that you would like to discuss with others, consider the minlp.org initiative. We monitor minlp.org and add model instantiations from it to MINLPLib.