    directorscut82
    @directorscut82
    Hi, I would like to ask whether the MATLAB wrapper supports Python's aggregator mean_and_list. I cannot find a way to return the individual fold errors for further statistical processing (e.g. confidence intervals), except maybe hacking global variables inside the objective function or writing to files, which is neither nice nor fast ;)
    Marc Claesen
    @claesenm
    @directorscut82 unfortunately, the matlab wrapper does not support the mean_and_list aggregator at this point
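    For readers following along: a minimal sketch of the semantics a mean_and_list-style aggregator provides. The helper below is an illustrative stand-in written for this log, not Optunity's actual implementation, and the fold scores are hypothetical.

    ```python
    def mean_and_list(fold_scores):
        # aggregate per-fold scores to their mean, but also keep the raw list
        # so the individual fold errors stay available for later statistics
        # (e.g. confidence intervals)
        return sum(fold_scores) / len(fold_scores), list(fold_scores)

    # hypothetical per-fold accuracies from a 4-fold run
    mean_score, all_scores = mean_and_list([0.81, 0.79, 0.84, 0.80])
    ```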
    @Arch111 you can do this by making your own objective function, e.g.
        x_train = foo  # your training data
        x_test = bar   # your held-out test data
        def create_objfun(list_of_hyperpars):
            # train a model on x_train with the given hyperparameters
            # predict on x_test
            return score
    directorscut82
    @directorscut82
    @claesenm thanks for your answer (at least now I am sure there isn't anything else to try)
    Marc Claesen
    @claesenm
    @directorscut82 note, though, that the aggregation of scores is done on the MATLAB side of things, so the necessary modifications in Optunity's MATLAB code are fairly minimal
    directorscut82
    @directorscut82
    Hello again. Is it possible to do stratified cross-validation in MATLAB (similar proportions of samples from each class in every fold)? I know there is strata, but if I understand correctly you have to explicitly state the indices.
    directorscut82
    @directorscut82
    Also, I believe Python's strata_by_labels does this, but I cannot find it in MATLAB.
    Also (x2 ;)): is strata = [[all_indices_class_1], [all_indices_class_2], ..., [all_indices_class_N]] the 'manual' way to get stratified sampling in cross_validate?
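    In Python terms, building strata manually from labels amounts to grouping sample indices per class. A small pure-Python sketch with hypothetical labels (independent of Optunity itself):

    ```python
    # hypothetical class labels for six samples
    y = [0, 0, 1, 1, 1, 0]

    # group sample indices by class; each inner list is one stratum, which is
    # the structure described above so that folds keep class proportions
    strata = [[i for i, label in enumerate(y) if label == c]
              for c in sorted(set(y))]
    # strata == [[0, 1, 5], [2, 3, 4]]
    ```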
    Arch111
    @Arch111
    @claesenm Hey, thanks Marc for the answer!
    I have a question regarding how the maximize and maximize_structured methods work and converge. Every time I run maximize_structured with the same constraints and conditions, the method seems to return different optimized parameters.
    Arch111
    @Arch111
    What should be done to make it return the same optimized parameters (or at least ones close to the global optimum)? I observed that the returned "optimized" parameter sets tend to differ dramatically from each other, which suggests they are not close to the global optimum. Thanks
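    Stochastic solvers (e.g. particle swarms, Optunity's default) start from random initial populations, so run-to-run variation is expected. Whether Optunity exposes a seed parameter is not settled in this discussion, but the general remedy is to fix the RNG seed; the stand-in random-search "solver" below illustrates the idea:

    ```python
    import random

    def random_search(seed, num_evals=1000):
        # stand-in for a stochastic solver maximizing f(x) = -(x - 3) ** 2
        rng = random.Random(seed)
        best_x, best_val = None, float("-inf")
        for _ in range(num_evals):
            x = rng.uniform(0.0, 10.0)
            val = -(x - 3.0) ** 2
            if val > best_val:
                best_x, best_val = x, val
        return best_x

    # fixing the seed makes repeated runs return identical results
    assert random_search(42) == random_search(42)
    ```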
    Arch111
    @Arch111
    How does Optunity compare with other tools like Hyperopt or Spearmint? Any opinions or experiences?
    Thanks
    Royi
    @RoyiAvital
    @Arch111, I have no experience, but of the three you mentioned it seems that Optunity is the only one which supports MATLAB. So if that makes a difference, there you go...
    chaine09
    @chaine09
    @claesenm Hey Marc!
    @claesenm are you here?
    abrock
    @abrock
    hi
    I have a function which takes a numpy array as a parameter; can I optimize it using Optunity?
    It looks like this:
    import math
    import numpy as np

    def target(x):
        return np.sum(x * x - np.cos(2 * math.pi * x)) + np.prod(x.shape)
    it's a test function for global optimization
    I don't want to fix the dimension, because I will test it with different dimension numbers and I would need to change many lines in my code.
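    One way to keep the dimension flexible is a small adapter that exposes the array-based function as n named scalar arguments, which is the interface box-constrained optimizers such as Optunity work with. A sketch under that assumption; the parameter names x0..x{n-1} are an arbitrary choice here:

    ```python
    import math

    import numpy as np

    def target(x):
        # Rastrigin-style test function over a numpy array of any length;
        # its global minimum is 0 at the origin
        return np.sum(x * x - np.cos(2 * math.pi * x)) + np.prod(x.shape)

    def make_objective(ndim):
        # adapt target() to a function of ndim named scalar hyperparameters
        def objective(**kwargs):
            x = np.array([kwargs["x%d" % i] for i in range(ndim)])
            return float(target(x))
        return objective

    f = make_objective(3)
    f(x0=0.0, x1=0.0, x2=0.0)  # evaluates target at the origin
    ```

    The matching box constraints can be built the same way, e.g. a dict mapping "x0".."x{n-1}" to bounds, so only ndim changes between experiments.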
    arjn95
    @arjn95
    Hi
    Does Optunity support custom sklearn regressors?
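    Worth noting here: an optimizer like Optunity only ever evaluates a user-supplied score function (hyperparameters in, scalar out), so any estimator following the sklearn fit/predict convention, custom or built-in, can sit inside it. A minimal sketch with a toy custom regressor and hypothetical data; the Optunity call itself is omitted:

    ```python
    import numpy as np

    class ConstantRegressor:
        # toy custom regressor following the sklearn fit/predict convention:
        # predicts the training-set mean plus a tunable offset
        def __init__(self, offset=0.0):
            self.offset = offset

        def fit(self, X, y):
            self.mean_ = float(np.mean(y))
            return self

        def predict(self, X):
            return np.full(len(X), self.mean_ + self.offset)

    # hypothetical training data
    X_train = np.arange(10).reshape(-1, 1)
    y_train = np.arange(10, dtype=float)

    def objective(offset):
        # this is all the optimizer sees: a hyperparameter in, a score out;
        # the estimator used inside is irrelevant to it
        model = ConstantRegressor(offset=offset).fit(X_train, y_train)
        return -float(np.mean((model.predict(X_train) - y_train) ** 2))
    ```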
    bikkinarohith
    @bikkinarohith
    Hello
    I am getting the error "ValueError: The truth value of an array with more than one element is ambiguous. Use a.any() or a.all()" while running Optunity with an SVM. My code is shown below:

    import optunity
    import optunity.metrics
    import sklearn.svm

    # score function: twice iterated 10-fold cross-validated accuracy
    @optunity.cross_validated(x=X_train, y=y_train, num_folds=10, num_iter=2)
    def svm_auc(x_train, y_train, x_test, y_test, logC, logGamma):
        model = sklearn.svm.SVC(C=10 ** logC, gamma=10 ** logGamma).fit(x_train, y_train)
        decision_values = model.decision_function(x_test)
        return optunity.metrics.roc_auc(y_test, decision_values)

    # perform tuning
    hps, _, _ = optunity.maximize(svm_auc, num_evals=200, logC=[-5, 2], logGamma=[-5, 1])

    # train model on the full training set with tuned hyperparameters
    optimal_model = sklearn.svm.SVC(C=10 ** hps['logC'], gamma=10 ** hps['logGamma']).fit(X_train, y_train)

    Could you please let me know why I am getting that error and how I could modify the code?
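    For context, that ValueError arises whenever a multi-element numpy array ends up where Python expects a single boolean, regardless of which library call triggers it. A minimal reproduction, unrelated to any Optunity internals:

    ```python
    import numpy as np

    a = np.array([1, 2, 3])
    try:
        # an elementwise comparison yields an array of booleans;
        # using that array as a single truth value is ambiguous
        if a > 1:
            pass
    except ValueError as err:
        message = str(err)
    ```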