See discussions on Fortran Discourse:<p><a href="https://fortran-lang.discourse.group/t/prima-has-got-a-python-interface/7942" rel="nofollow">https://fortran-lang.discourse.group/t/prima-has-got-a-pytho...</a>
PRIMA is a package for solving general nonlinear optimization problems without using derivatives. It provides the reference implementation of Powell's renowned derivative-free optimization methods: COBYLA, UOBYQA, NEWUOA, BOBYQA, and LINCOA. The "P" in the name stands for Powell, and "RIMA" is an acronym for "Reference Implementation with Modernization and Amelioration".<p>Thanks to the huge efforts of Nickolai Belakovski, PRIMA now has an official Python interface. It talks to Python via pybind11 and its C API rather than F2PY.<p>I hope PRIMA will serve as an example of binding modern Fortran libraries with Python.<p>Concerning Python, the next steps for PRIMA are<p>- making PRIMA available on PyPI;<p>- making PRIMA available on Conda;<p>- getting PRIMA into SciPy (see <a href="https://github.com/libprima/prima/issues/112">https://github.com/libprima/prima/issues/112</a>).<p>PRIMA is part of a research project funded by the Hong Kong Research Grants Council and the Department of Applied Mathematics (AMA) at the Hong Kong Polytechnic University (PolyU). The current version is ready to be used in Fortran, C, Python, MATLAB, and Julia.<p>------<p>Who was Powell? (from <a href="https://github.com/libprima/prima#who-was-powell">https://github.com/libprima/prima#who-was-powell</a>)<p>Michael James David Powell FRS [1] was "a British numerical analyst who was among the pioneers of computational mathematics" [2]. He was an inventor of and early contributor to the quasi-Newton method [3], the trust region method [4], the augmented Lagrangian method [5], and the SQP method [6], each of which is a pillar of modern numerical optimization. He also made significant contributions to approximation theory and methods [7].<p>Among numerous honors, Powell was one of the two recipients of the first Dantzig Prize [8], awarded by the Mathematical Optimization Society (MOS) and the Society for Industrial and Applied Mathematics (SIAM). 
This is considered the highest award in optimization.<p>[1] <a href="https://en.wikipedia.org/wiki/Michael_J._D._Powell" rel="nofollow">https://en.wikipedia.org/wiki/Michael_J._D._Powell</a><p>[2] <a href="https://royalsocietypublishing.org/doi/full/10.1098/rsbm.2017.0023" rel="nofollow">https://royalsocietypublishing.org/doi/full/10.1098/rsbm.201...</a><p>[3] <a href="https://en.wikipedia.org/wiki/Quasi-Newton_method" rel="nofollow">https://en.wikipedia.org/wiki/Quasi-Newton_method</a><p>[4] <a href="https://en.wikipedia.org/wiki/Trust_region" rel="nofollow">https://en.wikipedia.org/wiki/Trust_region</a><p>[5] <a href="https://en.wikipedia.org/wiki/Augmented_Lagrangian_method" rel="nofollow">https://en.wikipedia.org/wiki/Augmented_Lagrangian_method</a><p>[6] <a href="https://en.wikipedia.org/wiki/Sequential_quadratic_programming" rel="nofollow">https://en.wikipedia.org/wiki/Sequential_quadratic_programmi...</a><p>[7] <a href="https://www.cambridge.org/highereducation/books/approximation-theory-and-methods/66FD8CD6F18FE1ED499A8CA9A05F2A5A#overview" rel="nofollow">https://www.cambridge.org/highereducation/books/approximatio...</a><p>[8] <a href="https://en.wikipedia.org/wiki/Dantzig_Prize" rel="nofollow">https://en.wikipedia.org/wiki/Dantzig_Prize</a>
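<p>P.S. For readers unfamiliar with "optimization without using derivatives": the solver may only evaluate the objective function, never its gradient. Below is a toy compass-search sketch in plain Python to illustrate the idea. To be clear, this is NOT any of Powell's algorithms and not how PRIMA works internally (Powell's methods are far more sophisticated, model-based trust-region algorithms); it only shows what can be done with function values alone.

```python
# Toy derivative-free minimizer: compass (coordinate) search.
# Illustration only -- NOT one of Powell's algorithms (COBYLA, NEWUOA, ...).

def compass_search(f, x0, step=1.0, tol=1e-8, max_iter=10000):
    """Minimize f using only function evaluations, no gradients."""
    x = list(x0)
    fx = f(x)
    it = 0
    while step > tol and it < max_iter:
        improved = False
        for i in range(len(x)):
            for delta in (step, -step):
                y = list(x)
                y[i] += delta          # poll a neighboring point along axis i
                fy = f(y)
                if fy < fx:            # accept any improving poll point
                    x, fx = y, fy
                    improved = True
        if not improved:
            step /= 2.0                # no improvement: shrink the stencil
        it += 1
    return x, fx

# Minimize a smooth bowl with minimum at (1, -2); no gradient is ever computed.
xmin, fmin = compass_search(lambda v: (v[0] - 1) ** 2 + (v[1] + 2) ** 2,
                            [0.0, 0.0])
print(xmin, fmin)
```

Real derivative-free methods like Powell's build local models of the objective from past evaluations instead of polling blindly, which is why they need far fewer function evaluations.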