High-level packages for neural networks and optimization have greatly simplified the model development process. However, with these packages now becoming the de facto methods for approaching the task, the older methods are becoming something of a lost art. That is a bit of a problem, because there are still many relevant cases where the classic approaches simply work better.
Earlier this year I encountered a bug in the SciPy optimization routine `trust-constr`. This is the SciPy method most analogous to MATLAB's `fmincon`, in that it supports optimization with arbitrary (linear or nonlinear) constraints. Optimizing a function subject to arbitrary constraints comes up frequently in data science tasks. This post details the simple fix that I made within the SciPy source code (which will hopefully be merged soon), as well as workarounds that can be used in the meantime.
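For readers unfamiliar with the routine, here is a minimal sketch of what `trust-constr` usage looks like. This is a generic illustration (constrained minimization of the Rosenbrock function over the unit disk), not the specific case that triggered the bug:

```python
import numpy as np
from scipy.optimize import minimize, NonlinearConstraint

def rosenbrock(x):
    # Classic non-convex test function with minimum at (1, 1)
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

# Nonlinear constraint: keep the point inside the unit disk,
# x0^2 + x1^2 <= 1, which excludes the unconstrained minimum
circle = NonlinearConstraint(lambda x: x[0]**2 + x[1]**2, -np.inf, 1.0)

res = minimize(rosenbrock, x0=[0.5, 0.5], method='trust-constr',
               constraints=[circle])
print(res.x)  # constrained minimizer on the circle boundary
```

Because the unconstrained minimum (1, 1) lies outside the unit disk, the solver is forced to trade off the objective against the constraint and returns a point on the boundary of the feasible region.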
30 Dec 2020 » Hello, world!
A brief statement of purpose