hey @cscorley sorry for the slow reply ... Optunity internally uses continuous optimizers, so at the moment the only option is to cast to int as you've been doing
adding explicit support for integers is on the to-do list currently, but admittedly not very high
if you have use cases where casting to int() is problematic for some reason we would be interested to hear about those!
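To illustrate the casting workaround being discussed: Optunity's solvers propose floats, so an integer hyperparameter is rounded inside the objective function before use. A minimal stdlib sketch (the wrapper `make_discrete` and the toy objective `score` are illustrative, not part of Optunity's API):

```python
def make_discrete(f, *int_pars):
    """Wrap objective f so the named hyperparameters are cast to int
    before evaluation (Optunity's solvers propose floats)."""
    def wrapped(**kwargs):
        for name in int_pars:
            kwargs[name] = int(round(kwargs[name]))
        return f(**kwargs)
    return wrapped

# toy objective with an integer hyperparameter, maximized at n_estimators == 25
def score(n_estimators):
    return -(n_estimators - 25) ** 2

objfun = make_discrete(score, 'n_estimators')
print(objfun(n_estimators=24.7))  # rounds 24.7 to 25 -> 0
```

The wrapped `objfun` can then be handed to an Optunity solver as usual; note that, as pointed out below, nearby floats that round to the same integer will still cost separate evaluations.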
OK! good to know.
the only foreseeable case i have is possibly wasting an evaluation on an integer that's previously been tried
Yeah, that's a good point. I'm playing with the idea of specifying an 'equivalence distance' (could be a callable) for each hyperparameter that would prevent that stuff from happening. This would probably be a fairly clean fix, and solves a variety of requests from our users in one go.
It should be straightforward to support passage. What parameters would make sense to tune there? The size of the recurrent layer?
Hi there. How can I tune a sklearn ML model (using optunity.maximize_structured with a search space) without any cross-validation, on specific x_train and x_test data? In other words, I want to determine myself which data set is used for training and which for testing. I would appreciate a simple and clear example. Thanks
hi.. i would like to ask if the matlab wrapper supports python's "mean_and_list" aggregator? I cannot find a way to return the individual fold errors for further statistical processing (e.g. confidence intervals etc.) [except maybe hacking global variables inside the objective function or writing to files, which is not very nice or fast ;)]
@directorscut82 unfortunately, the matlab wrapper does not support the mean_and_list aggregator at this point
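For reference, the aggregator in question essentially does the following: it reduces the per-fold scores to their mean while also keeping the individual scores for later statistical processing. A stdlib sketch of that behaviour (not Optunity's actual implementation):

```python
def mean_and_list(fold_scores):
    """Aggregate per-fold scores into their mean, but also return the
    individual fold scores for further statistical processing."""
    mean = sum(fold_scores) / len(fold_scores)
    return mean, list(fold_scores)

mean, scores = mean_and_list([0.8, 0.9, 0.7])
```

This is what the MATLAB wrapper would need to replicate on its side of the bridge to expose the per-fold errors.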
@Arch111 you can do this by making your own objective function, e.g.:
x_train = foo
x_test = bar
def create_objfun(your list of hyperpars):
    # train with x_train
    # predict with x_test
    return score
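Fleshing that pattern out into a self-contained sketch: the objective function simply closes over a fixed train/test split, so no cross-validation is involved. Here a trivial threshold "model" stands in for a real sklearn estimator, and all names and data are illustrative:

```python
# toy 1-D data with a fixed, user-chosen train/test split
x_train = [0.1, 0.4, 0.35, 0.8, 0.9, 0.7]
y_train = [0, 0, 0, 1, 1, 1]
x_test  = [0.2, 0.6, 0.85]
y_test  = [0, 1, 1]

def objfun(threshold):
    """Objective closing over the fixed split: 'train' is trivial here,
    a real version would fit a sklearn model on (x_train, y_train)."""
    predictions = [1 if x > threshold else 0 for x in x_test]
    correct = sum(p == y for p, y in zip(predictions, y_test))
    return correct / len(y_test)  # accuracy on the held-out test set

# this function can then be passed to an Optunity solver, e.g.
# optimal_pars, _, _ = optunity.maximize(objfun, num_evals=50, threshold=[0, 1])
print(objfun(0.5))  # -> 1.0
```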
@claesenm thanks for your answer (at least now i am sure there isnt anything else to try)
@directorscut82 note, though, that the aggregation of scores is done on the MATLAB side of things, so the necessary modifications in Optunity's MATLAB code are fairly minimal
hello again.. is it possible to do stratified cross-validation in matlab (similar proportions of samples from each class in every fold)? ... i know there is strata, but if i understand correctly you have to explicitly state the indices
also, i believe python's strata_by_labels does this, but i cannot find it in matlab
also x2 ;) is strata = [[all_indices_class_1], [all_indices_class_2], ..., [all_indices_class_N]] the 'manual' way to get stratified sampling in cross_validate?
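That grouping of indices by class label is straightforward to build by hand. A stdlib sketch of the idea behind the Python-side strata_by_labels helper mentioned above (illustrative, not Optunity's actual implementation):

```python
def strata_by_labels(labels):
    """Group sample indices by class label, producing one stratum per
    class: [[indices of class 1], [indices of class 2], ...]."""
    strata = {}
    for idx, label in enumerate(labels):
        strata.setdefault(label, []).append(idx)
    return list(strata.values())

print(strata_by_labels([0, 1, 0, 1, 1]))  # -> [[0, 2], [1, 3, 4]]
```

The resulting list-of-index-lists is exactly the 'manual' strata structure described in the question, so the same construction could be done on the MATLAB side.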
@claesenm Hey Marc, thanks for the answer!
I have a question regarding how the maximize and maximize_structured methods work and converge. Every time i run maximize_structured with the same constraints and conditions, the method seems to return different optimized parameters.
What should be done to make it return the same optimized parameters (or at least parameters close to the global optimum)? I have observed that the returned "optimized" parameter sets tend to differ dramatically from each other, which suggests they are not close to the global optimum. Thanks
How does optunity compare with other tools like hyperopt or spearmint? Any opinions or experiences ?
@Arch111, I have no experience, but of the three you mentioned it seems that Optunity is the only one which supports MATLAB. So if that makes a difference, there you go...
@claesenm Hey Marc!
@claesenm are you here?
I have a function which takes a numpy array as parameter, can I optimize it using Optunity?