    Shunta Saito
    @mitmul
    -sameroom portal
    Sameroom
    @sameroom-bot
    <Sameroom> Your Portal URL is https://sameroom.io/jDAgNHmA -- you can send the URL to someone on a different team to share this room. Note: you can connect more than two teams this way.
    <Sameroom> I've connected 1 new room #text (chainer) on Slack. See map
    Sameroom
    @sameroom-bot
    [Kurt, chainer] hi, i'm trying to run the seq2seq example here: https://docs.chainer.org/en/stable/examples/seq2seq.html but I get an error:
    [Kurt, chainer] python3 seq2seq.py --gpu=0 ../../dataset/wmt/giga-fren.preprocess.en ../../dataset/wmt/giga-fren.preprocess.fr ../../dataset/wmt/vocab.en ../../dataset/wmt/vocab.fr --validation-source ../../dataset/wmt/newstest2013.preprocess.en --validation-target ../../dataset/wmt/newstest2013.preprocess.fr
    Traceback (most recent call last):
      File "seq2seq.py", line 111, in <module>
        @chainer.dataset.converter()
    AttributeError: module 'chainer.dataset' has no attribute 'converter'
    [Kurt, chainer] Does anyone know how to resolve this?
    [Kurt, chainer] Thank you
    Sameroom
    @sameroom-bot
    [nishino, chainer] It looks like you're running the seq2seq example from the master branch. When you're using the stable version of Chainer, please run the examples from the stable branch:
    https://github.com/chainer/chainer/blob/v5/examples/seq2seq/seq2seq.py
    Alternatively, you can use the beta version of Chainer.
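    [Editor's note: the error comes from the `@chainer.dataset.converter()` decorator, which does not exist in the v5 API. A minimal sketch of a decorator-free converter in the style of the stable-branch examples, built on `chainer.dataset.concat_examples`; the function name `convert` and the toy batch are illustrative:]

    ```python
    import numpy as np
    from chainer.dataset import concat_examples

    # Decorator-free converter: stack a list of per-example arrays into
    # one batch array and (optionally) move it to the given device.
    def convert(batch, device=None):
        return concat_examples(batch, device)

    batch = [np.array([1, 2], dtype=np.int32),
             np.array([3, 4], dtype=np.int32)]
    print(convert(batch).shape)  # the two examples stacked into a (2, 2) array
    ```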
    Sameroom
    @sameroom-bot
    [Kurt, chainer] ha, indeed @nishino, thank you very much! It's now training :)
    Sameroom
    @sameroom-bot
    [Kurt, chainer] hi, I've got the latest stable version of Chainer and cupy but I still got this error. I posted my issue here: cupy/cupy#1912 Does anyone know how to resolve this? Cheers
    Sameroom
    @sameroom-bot
    [Kurt, chainer] hi all, a basic chainer question. My understanding is that the EmbedID link in chainer automatically fine-tunes the word embedding weights W because it is a link, right? What if I don't want to update the word embeddings? Do you know how to disable that? There's a disable_update() method here but I'm not sure: https://docs.chainer.org/en/stable/reference/generated/chainer.links.EmbedID.html
    Sameroom
    @sameroom-bot
    [Masaki Kozuki, chainer] Yes, that’s right. disable_update will do what you want.
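    [Editor's note: a minimal sketch of freezing an EmbedID inside a small Chain; the model, layer sizes, and data are illustrative. disable_update() acts on the parameters' update rules, which the optimizer attaches in setup(), so here it is called after optimizer.setup():]

    ```python
    import numpy as np
    import chainer
    import chainer.functions as F
    import chainer.links as L

    class Model(chainer.Chain):
        def __init__(self, n_vocab=10, n_units=4):
            super(Model, self).__init__()
            with self.init_scope():
                self.embed = L.EmbedID(n_vocab, n_units)
                self.out = L.Linear(n_units, 2)

        def __call__(self, xs):
            return self.out(self.embed(xs))

    model = Model()
    optimizer = chainer.optimizers.SGD(lr=0.1)
    optimizer.setup(model)
    model.embed.disable_update()  # freeze the embedding matrix W

    x = np.array([1, 2, 3], dtype=np.int32)
    t = np.array([0, 1, 0], dtype=np.int32)

    w_before = model.embed.W.array.copy()
    loss = F.softmax_cross_entropy(model(x), t)
    model.cleargrads()
    loss.backward()
    optimizer.update()

    # The embedding weights are untouched by the update.
    print(np.array_equal(model.embed.W.array, w_before))
    ```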
    Sameroom
    @sameroom-bot
    [Juan C., chainer] hi! I'm trying to build a network that takes a 4-dimensional numpy array of floats as an input. Is there any Dataset implementation that I can use?
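    [Editor's note: the question above goes unanswered in this log. A minimal sketch of one common approach, chainer.datasets.TupleDataset, which wraps equally-sized arrays so that indexing yields per-example tuples; the shapes and labels here are illustrative:]

    ```python
    import numpy as np
    from chainer.datasets import TupleDataset

    # Hypothetical data: 100 examples, each a 3x16x16 float array, with int labels.
    x = np.random.rand(100, 3, 16, 16).astype(np.float32)
    y = np.random.randint(0, 2, size=100).astype(np.int32)

    dataset = TupleDataset(x, y)
    sample_x, sample_y = dataset[0]  # one (example, label) pair
    print(sample_x.shape)  # a single example: (3, 16, 16)
    ```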