    Tianqi Chen
    @tqchen
    hi guys
    @starimpact People here are geographically distributed, so there can be a delay in replying to IM messages.
    For proposed changes, we can use GitHub issues to propose one change at a time. This has been quite effective so far because each proposal is actionable, either by its submitter or by others who would like to take charge.
    Gitter is great for quick chat and troubleshooting.
    Zhang Ming
    @starimpact
    hello guys
    is anyone there?
    Zhang Ming
    @starimpact
    @tqchen I have an urgent problem, can you help me?
    Zheng Xu
    @XericZephyr
    What?
    Zhang Ming
    @starimpact
    can anyone help me check #554?
    Zhang Ming
    @starimpact
    I don't know how to make a layer act as a loss function. I know how to do it in Python, where I can set init(False).
    help me.....
    Zhang Ming
    @starimpact
    hello guys
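        // masked binary cross-entropy: -(label*log(p + eps) + (1 - label)*log(1 - p + eps)) * mark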
        tdata_out = F<mshadow_op::log>(tdata_in + MIN_NUM) * tlabel +
                    (1.0f - tlabel) * F<mshadow_op::log>(1.0f - tdata_in + MIN_NUM);
        tdata_out = tdata_out * tmarklabel * -1;
    which causes this warning:
    ./mshadow/mshadow/tensor.h: In function ‘void mshadow::MapExp(mshadow::TRValue<R, mshadow::gpu, dim, DType>*, const mshadow::expr::Exp<E, DType, etype>&) [with Saver = mshadow::sv::saveto; R = mshadow::Tensor<mshadow::gpu, 2, float>; int dim = 2; DType = float; E = mshadow::expr::BinaryMapExp<mshadow::op::minus, mshadow::expr::BinaryMapExp<mshadow::op::div, mshadow::Tensor<mshadow::gpu, 2, float>, mshadow::expr::BinaryMapExp<mshadow::op::plus, mshadow::Tensor<mshadow::gpu, 2, float>, mshadow::expr::ScalarExp<float>, float, 1>, float, 1>, mshadow::expr::BinaryMapExp<mshadow::op::div, mshadow::expr::BinaryMapExp<mshadow::op::minus, mshadow::expr::ScalarExp<float>, mshadow::Tensor<mshadow::gpu, 2, float>, float, 1>, mshadow::expr::BinaryMapExp<mshadow::op::plus, mshadow::expr::BinaryMapExp<mshadow::op::minus, mshadow::expr::ScalarExp<float>, mshadow::Tensor<mshadow::gpu, 2, float>, float, 1>, mshadow::expr::ScalarExp<float>, float, 1>, float, 1>, float, 1>; int etype = 1]’:
    ./mshadow/mshadow/tensor.h:67:1: warning: ‘shape2.mshadow::Shape<2>::shape_[1u]’ may be used uninitialized in this function [-Wmaybe-uninitialized]
           this->shape_[i] = s[i];
     ^
    ./mshadow/mshadow/././expr_engine-inl.h:361:13: note: ‘shape2’ was declared here
         Shape<dim> shape2 = ShapeCheck<dim, TB>::Check(t.rhs_);
                 ^
    What's the problem with the expression, and how can I remove the warning?
    Zhang Ming
    @starimpact
    help me with issue #594
    I have finished the cpp version
    but it does not work on GPU
    CPU is OK
    @tqchen @winstywang @vchuravy
    Zhang Ming
    @starimpact
    @piiswrong
    😭
    😭😪😥😰😅😓😢😂
    Zhang Ming
    @starimpact
    😒😒😒😒
    Zhang Ming
    @starimpact
    @tqchen can you give some hints?
    Zhang Ming
    @starimpact
    is anyone there?
    need help
    Kornél
    @korneil
    hi! In a 4-class classification I get Validation-accuracy=0.250000 continuously, even after the 100th epoch. Probably I am preparing the label data wrong. For a 4-class problem, should the label be a 1-D array with values from 0 to 3, or am I doing something wrong?
    I am using SoftmaxOutput at the end of the network
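    For reference, a minimal sketch of the label layout SoftmaxOutput expects: a 1-D array of class indices 0..3, one per example, not one-hot vectors. The toy arrays and NDArrayIter arguments below are illustrative assumptions, not taken from this thread:
        import numpy as np
        import mxnet as mx

        # Labels for SoftmaxOutput are plain class ids, one per example.
        X = np.random.rand(100, 20).astype(np.float32)               # toy feature matrix
        y = np.random.randint(0, 4, size=(100,)).astype(np.float32)  # class ids 0..3
        train_iter = mx.io.NDArrayIter(X, y, batch_size=10, shuffle=True)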
    Zheng Xu
    @XericZephyr
    You will probably need to shuffle your training and testing data first.
    The shuffle option in ImageRecordIter is currently weak.
    So you can shuffle the lines of the image list file yourself a few times.
    Sheng Wang
    @shengwa
    Data shuffling may be one problem. You can also test different learning rates.
    Kornél
    @korneil
    the data is shuffled. What range of learning rates do you suggest trying?
    sunsocool
    @sunsocool
    Hello guys. I am starting a Deep Learning International WeChat group and would like to invite you to join it.
    The mission of this group is not to call for papers but to call for awesome products.
    I wrote an introduction to the group members in this post:
    https://www.linkedin.com/pulse/deep-learning-international-wechat-group-welcome-join-lin-sun
    My WeChat ID is sunsocool. Add me on WeChat with the message "MxNet" and I'll send you the group invitation.
    Thank you guys, and welcome to the group!
    Qingsong Liu
    @pineking
    @sunsocool
    Tong Guo
    @guotong1988
    @sunsocool
    Deep Learning International WeChat Group, I've added you.
    sunsocool
    @sunsocool
    @guotong1988 @pineking Welcome to the Deep Learning International WeChat group, guys~ :clap:
    Parag K Mital
    @pkmital
    Hi, I'm interested in using the pretrained 21k ImageNet model, but I'm not sure how to get started. I have installed mxnet and found the model here: https://github.com/dmlc/mxnet-model-gallery/tree/master/imagenet-21k-inception - but how do I load the parameters into a network for prediction?
    Parag K Mital
    @pkmital
    ok, never mind, figured it out... you create the FeedForward net by calling load with the string 'Inception' and the epoch number...
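    In case it helps others, a minimal sketch of that approach, assuming the downloaded checkpoint files use the prefix 'Inception' as mentioned above; the epoch value, device, and val_iter below are placeholders:
        import mxnet as mx

        # Load the pretrained Inception-21k checkpoint by prefix and epoch number
        # (the epoch should match the number in the downloaded .params filename).
        model = mx.model.FeedForward.load('Inception', 9, ctx=mx.gpu())

        # Predict on a DataIter of images preprocessed the same way as in training.
        prob = model.predict(val_iter)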
    sunsocool
    @sunsocool
    @tw991 Welcome to the Deep Learning International WeChat group~ :clap:
    Tian Wang
    @tw991
    @sunsocool Deep Learning International WeChat Group. Thanks for the invitation!
    Kornél
    @korneil
    I wrote a new operator for test purposes. How can I include it in the build?
    oh I see, the Makefile picks it up
    Chris DuBois
    @doobwa
    What happened to mx.sym.Embedding? It's still used by https://github.com/dmlc/mxnet/blob/master/example/rnn/lstm.py, but I don't see it in v0.5.0.
    Zheng Xu
    @XericZephyr
    I think you need to git pull the newest mxnet source code and rebuild the entire project. Embedding seems to be a new operator in the C++ implementation.
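    For anyone hitting this later, a minimal sketch of how the operator is used in the lstm.py example linked above; the vocabulary size and embedding width are placeholder values:
        import mxnet as mx

        data = mx.sym.Variable('data')  # integer word ids
        # Maps each id to a dense vector; requires a build recent enough to
        # include the C++ Embedding operator.
        embed = mx.sym.Embedding(data=data,
                                 input_dim=10000,   # vocabulary size (placeholder)
                                 output_dim=256,    # embedding width (placeholder)
                                 name='embed')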
    sunsocool
    @sunsocool
    @XericZephyr Welcome to the Deep Learning International WeChat group~ :clap:
    baoqp
    @baoqingping
    @sunsocool Deep Learning International WeChat Group. Thanks for the invitation!