    Jaak Simm
    It should be straightforward to support passage. What parameters would make sense to tune there? The size of the recurrent layer?
    Hi there. How can I tune an sklearn ML model (using optunity.maximize_structured with a search space) without any cross-validation, on specific x_train and x_test data (i.e. I want to determine which data set is used for training and which one for testing)? I would appreciate a simple and clear example.
    Hi, I would like to ask whether the MATLAB wrapper supports Python's aggregator "mean_and_list"? I cannot find a way to return the individual fold errors for further statistical processing (e.g. confidence intervals etc.), except maybe hacking global variables inside the objective function or writing to files, which is not very nice or fast ;)
    Marc Claesen
    @directorscut82 unfortunately, the matlab wrapper does not support the mean_and_list aggregator at this point
    @Arch111 you can do this by making your own objective function, e.g.
    x_train = foo
    x_test = bar
    def create_objfun(your list of hyperpars):
        # train with x_train
        # predict with x_test
        return score
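A runnable sketch of this suggestion, using a fixed train/test split and no cross-validation. The data, the SVC model, and the hyperparameter names (logC, logGamma) are illustrative choices of mine, not part of Optunity itself:

```python
import sklearn.svm
import sklearn.metrics
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

# a fixed split: we decide which data is used for training and testing
X, y = make_classification(n_samples=200, random_state=0)
x_train, x_test, y_train, y_test = train_test_split(X, y, random_state=0)

def objfun(logC, logGamma):
    """Score on the fixed test set -- no cross-validation involved."""
    model = sklearn.svm.SVC(C=10 ** logC, gamma=10 ** logGamma)
    model.fit(x_train, y_train)
    return sklearn.metrics.accuracy_score(y_test, model.predict(x_test))

# this plain function can then be handed to the optimizer, e.g.:
# hps, _, _ = optunity.maximize(objfun, num_evals=100,
#                               logC=[-3, 2], logGamma=[-4, 0])
```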
    @claesenm thanks for your answer (at least now I am sure there isn't anything else to try)
    Marc Claesen
    @directorscut82 note, though, that the aggregation of scores is done on the MATLAB side of things, so the necessary modifications in Optunity's MATLAB code are fairly minimal
    Hello again... is it possible to do stratified cross-validation in MATLAB (similar proportions of samples from each class in every fold)? I know there is strata, but if I understand correctly you have to explicitly state the indices.
    Also, I believe Python's strata_by_labels does this, but I cannot find it in MATLAB.
    Also x2 ;) is strata = [[all_indices_class_1], [all_indices_class_2], ..., [all_indices_class_N]] the 'manual' way to get stratified sampling in cross_validate?
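For illustration, building such strata "manually" from a label vector amounts to collecting one list of indices per class (roughly what Python's strata_by_labels does); the toy labels below are my own example:

```python
import numpy as np

y = np.array([0, 1, 0, 2, 1, 0, 2, 1])

# one list of sample indices per class
strata = [np.where(y == c)[0].tolist() for c in np.unique(y)]
# -> [[0, 2, 5], [1, 4, 7], [3, 6]]

# these lists can then be passed to the cross-validation decorator, e.g.:
# @optunity.cross_validated(x=X, y=y, num_folds=2, strata=strata)
# def f(x_train, y_train, x_test, y_test): ...
```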
    @claesenm Hey thanks marc for the answer!
    I have a question regarding how the maximize and maximize_structured methods work and converge. Every time I run maximize_structured with the same constraints and conditions, the method seems to return different optimized parameters.
    What should be done to make it return the same optimized parameters (or at least ones close to the global optimum)? I have observed that the returned "optimized" parameter sets tend to be dramatically different from each other, which suggests they are not close to the global optimum. Thanks
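The run-to-run variation is what one would expect from a stochastic solver (Optunity's default is particle swarm optimization). A toy sketch of the effect, using a simple random search of my own rather than Optunity's solvers: fixing the seed makes results reproducible, and more evaluations bring the result closer to the optimum.

```python
import random

def f(x):
    # toy objective with its maximum at x = 2
    return -(x - 2.0) ** 2

def random_search(num_evals, seed):
    # a stochastic optimizer: results vary unless the seed is fixed
    rng = random.Random(seed)
    candidates = [rng.uniform(-5.0, 5.0) for _ in range(num_evals)]
    return max(candidates, key=f)

# same seed -> identical result; more evaluations -> closer to x = 2
a = random_search(200, seed=42)
b = random_search(200, seed=42)
assert a == b
```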
    How does optunity compare with other tools like hyperopt or spearmint? Any opinions or experiences ?
    @Arch111, I have no experience, but of the three you mentioned it seems that Optunity is the only one that supports MATLAB. So if that makes a difference, there you go...
    @claesenm Hey Marc!
    @claesenm are you here?
    I have a function which takes a numpy array as parameter, can I optimize it using Optunity?
    It looks like this:
    def target(x):
        return np.sum(x * x - np.cos(2 * math.pi * x)) + np.prod(x.shape)
    it's a test function for global optimization
    I don't want to hard-code the dimension, because I will test it with different numbers of dimensions and would otherwise need to change many lines in my code
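One possible workaround (my own sketch, not an official Optunity recipe): since the optimizer works with named scalar arguments, generate one argument per dimension and rebuild the numpy array inside a wrapper. The names "x0", "x1", ... are my own convention:

```python
import math
import numpy as np

def target(x):
    # the test function from the question, rewritten here so this
    # sketch is self-contained
    return np.sum(x * x - np.cos(2 * math.pi * x)) + np.prod(x.shape)

def make_wrapper(dim):
    """Wrap target() so it accepts dim named scalar arguments."""
    def wrapped(**kwargs):
        x = np.array([kwargs['x%d' % i] for i in range(dim)])
        return target(x)
    return wrapped

dim = 3
f = make_wrapper(dim)
box = {'x%d' % i: [-5.0, 5.0] for i in range(dim)}

# the box constraints can then be splatted into the optimizer, e.g.:
# xopt, _, _ = optunity.minimize(f, num_evals=500, **box)
```

Changing `dim` is then a one-line edit.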
    Does optunity support custom sklearn regressors?
    I am getting this error while using Optunity with an SVM: "ValueError: The truth value of an array with more than one element is ambiguous. Use a.any() or a.all()". My code is shown below:

    import optunity
    import optunity.metrics
    import sklearn.svm

    # score function: twice iterated 10-fold cross-validated accuracy
    @optunity.cross_validated(x=X_train, y=y_train, num_folds=10, num_iter=2)
    def svm_auc(x_train, y_train, x_test, y_test, logC, logGamma):
        model = sklearn.svm.SVC(C=10 ** logC, gamma=10 ** logGamma).fit(x_train, y_train)
        decision_values = model.decision_function(x_test)
        return optunity.metrics.roc_auc(y_test, decision_values)

    # perform tuning
    hps, _, _ = optunity.maximize(svm_auc, num_evals=200, logC=[-5, 2], logGamma=[-5, 1]).any()

    # train model on the full training set with tuned hyperparameters
    optimal_model = sklearn.svm.SVC(C=10 ** hps['logC'], gamma=10 ** hps['logGamma']).fit(X_train, y_train)

    Could you please let me know why I am getting that error and how I could modify the code?
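For isolating errors like this one, a self-contained variant of the score function above can help. Synthetic binary data and sklearn.metrics.roc_auc_score stand in for the user's X_train/y_train and optunity.metrics.roc_auc (both substitutions are mine):

```python
import sklearn.svm
from sklearn.datasets import make_classification
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# synthetic binary classification data
X, y = make_classification(n_samples=300, random_state=0)
x_train, x_test, y_train, y_test = train_test_split(X, y, random_state=0)

def svm_auc(logC, logGamma):
    model = sklearn.svm.SVC(C=10 ** logC, gamma=10 ** logGamma)
    model.fit(x_train, y_train)
    decision_values = model.decision_function(x_test)
    return roc_auc_score(y_test, decision_values)

# if this runs but the cross-validated version raises the ValueError,
# the problem is more likely in the data fed to the decorator (e.g.
# non-binary labels) than in the tuning loop itself
```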