param (best): [1.0724442427550391, 1.151531565672518]
initial position - step length * gradient
. This will be the new position, at which it will again compute the gradient, evaluate the cost function, compute the step length, take a step, and so on. It repeats this iteratively until a stopping criterion is met. Nocedal & Wright have very good chapters on line searches and gradient descent and can explain this much better than I can
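The loop described above can be sketched in plain Rust. This is only an illustrative toy, not argmin's actual implementation: the cost function, its gradient, the fixed step length, and the function names here are all made up for the example (a real solver would pick the step length with a line search):

```rust
// Toy gradient descent on f(x, y) = (x - 1)^2 + (y - 2)^2.
// Hypothetical example code, not taken from the argmin crate.

/// Gradient of the toy cost function: [2(x - 1), 2(y - 2)].
fn gradient(p: &[f64; 2]) -> [f64; 2] {
    [2.0 * (p[0] - 1.0), 2.0 * (p[1] - 2.0)]
}

/// Repeat: new position = initial position - step length * gradient,
/// until the gradient norm (the stopping criterion) is small enough.
fn descend(start: [f64; 2], step_length: f64, max_iters: usize) -> [f64; 2] {
    let mut position = start;
    for _ in 0..max_iters {
        let g = gradient(&position);
        // Stopping criterion: gradient close to zero.
        if (g[0] * g[0] + g[1] * g[1]).sqrt() < 1e-8 {
            break;
        }
        // The update step quoted above.
        position = [
            position[0] - step_length * g[0],
            position[1] - step_length * g[1],
        ];
    }
    position
}

fn main() {
    let best = descend([0.0, 0.0], 0.1, 200);
    println!("param (best): [{}, {}]", best[0], best[1]);
}
```

With a fixed step length of 0.1 this contracts toward the minimizer (1, 2); the line-search machinery in a real solver exists precisely to choose that step length safely for arbitrary cost functions.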
OpWrapper
(https://github.com/argmin-rs/argmin/blob/master/src/core/opwrapper.rs), which has no tests yet. Apart from that, since you are a bit new to optimization, I would suggest working through a book, maybe Nocedal & Wright, and while doing so thinking of possible tests for a particular solver. I think meaningful tests for the solvers can only be found once one understands the algorithm in detail.