Yuntian
@da03
@saurabhvyas Are you using the LuaTorch version or the PyTorch version? If it's the PyTorch version, then no -tgt needs to be provided to translate.py; if it's LuaTorch, you can provide dummy ground-truth labels such as 'aaa'
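For the LuaTorch path, a minimal sketch of generating such a dummy -tgt file (file names here are placeholders):

```python
# Write one dummy target line ('aaa') per source line so that a -tgt
# file can be passed alongside the source; file names are hypothetical.
with open("src-test.txt") as src, open("dummy-tgt.txt", "w") as tgt:
    for _ in src:
        tgt.write("aaa\n")
```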
stribizhev
@stribizhev
Hi guys, any idea why I am getting THCudaCheck FAIL file=/tmp/luarocks_cunn-scm-1-1394/cunn/lib/THCUNN/generic/SoftMax.cu line=72 error=48 : no kernel image is available for execution on the device, followed by /torch/install/bin/luajit: /torch/install/share/lua/5.1/nn/THNN.lua:110: cuda runtime error (48) : no kernel image is available for execution on the device at /tmp/luarocks_cunn-scm-1-1394/cunn/lib/THCUNN/generic/SoftMax.cu:72, right after [07/06/18 11:09:23 INFO] Preparing memory optimization...? I run training on a server with 4 Tesla V100 GPUs and can't work around this issue. I launch training using th ./train.lua ... -gpuid 1 2 3 4.
stribizhev
@stribizhev
Ok, reinstalled Torch and all Luarocks, rebooted, tried to install NCCL 2.2, updated nn and cunn, and it now runs with CUDA_VISIBLE_DEVICES=0,1,2,3 th ./train.lua ... -gpuid 1 2 3 4. But for the life of me, it still says there is no NCCL, though I installed it from the NVIDIA site, both the Ubuntu 16.04 and the OS-agnostic versions (for CUDA 9.0, though I have CUDA 9.1 installed; there is no NCCL build for CUDA 9.1 :().
Saurabh Vyas
@saurabhvyas
@da03 Thanks for your reply, and sorry for the late response. It was for Lua, and it worked nicely with dummy data. I am now trying the PyTorch version as well as a TensorFlow implementation (https://github.com/guillaumegenthial/im2latex/). In the TF version, I trained with custom font rendering and size augmentation on the same formulas.lst provided by the im2latex dataset. Even though training perplexity reached 1.07 and validation perplexity 1.15, almost 80% of the formulas in the test set do not match exactly. Should I play with the hyperparameters?
Sirogha
@Sirogha
Hi. How can I find out which CUDA version is used by OpenNMT?
Guillaume Klein
@guillaumekln
@Sirogha Do you mean the CUDA version that is required by OpenNMT?
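If the question is which CUDA toolkit the OpenNMT-py (PyTorch) build uses, one quick check from Python (this assumes the PyTorch version is meant; LuaTorch would need a different check):

```python
import torch

print(torch.version.cuda)         # CUDA toolkit version PyTorch was built against
print(torch.cuda.is_available())  # whether a usable GPU was found
```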
M Saiful Bari
@sbmaruf
@guillaumekln hi there...!
If two languages don't share a character set, how do we translate a name? For example, "Rahul" (a male name) in English should be "راهول" in Arabic and "राहुल" in Hindi.
So how does <unk> tag replacement actually work in that case?
If there is a related reference paper you can cite, that would be great too. Also, does OpenNMT support this?
Guillaume Klein
@guillaumekln
Hi! In OpenNMT-Lua, there is the -phrase_table option that can be used for this: for any <unk> token in the target, the corresponding source token is looked up in the table to find a translation. Other approaches include splitting names into characters and letting the model learn the translation or, more commonly, replacing entities with placeholder tokens and having a separate processing step replace these placeholders.
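A rough sketch of the lookup that -phrase_table performs, assuming a one-entry-per-line source|||target file format and that the model supplies per-token alignment:

```python
# Toy version of <unk> replacement with a phrase table: for each <unk>
# in the output, look up the source token the model attended to.
phrase_table = {}
with open("phrase_table.txt") as f:              # lines like: Rahul|||راهول
    for line in f:
        src, tgt = line.rstrip("\n").split("|||", 1)
        phrase_table[src] = tgt

def replace_unk(output_tokens, attended_source_tokens):
    """attended_source_tokens[i] is the source token the model attended
    to when producing output_tokens[i] (alignment comes from the model)."""
    return [
        phrase_table.get(src, src) if tok == "<unk>" else tok
        for tok, src in zip(output_tokens, attended_source_tokens)
    ]
```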
M Saiful Bari
@sbmaruf
Is it also available in OpenNMT-py/tf?
I don't see any -phrase_table option for OpenNMT-py.
Guillaume Klein
@guillaumekln
Unfortunately it's only in OpenNMT-lua, but it would not be complicated to add to OpenNMT-py. This feature gets less attention nowadays because it requires alignment information from the model, which makes it incompatible with Transformer models.
M Saiful Bari
@sbmaruf
So if I'm using a Transformer model, there's no way I can solve this?
Guillaume Klein
@guillaumekln
You can try splitting them into characters if you believe there are enough examples in the training data for the model to learn something. Otherwise, this requires external pre- and post-processing.
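For the placeholder approach, a toy pre/post-processing sketch (not an OpenNMT feature; the placeholder format and the transliteration function are hypothetical):

```python
def pre_process(sentence, names):
    """Replace known names with placeholder tokens before translation."""
    mapping = {}
    for i, name in enumerate(names):
        placeholder = f"__ENT{i}__"          # hypothetical placeholder token
        sentence = sentence.replace(name, placeholder)
        mapping[placeholder] = name
    return sentence, mapping

def post_process(translation, mapping, transliterate):
    """Restore placeholders, transliterating each name into the target script."""
    for placeholder, name in mapping.items():
        translation = translation.replace(placeholder, transliterate(name))
    return translation
```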
M Saiful Bari
@sbmaruf
I was thinking about doing some kind of transliteration/romanization of the target language into the source language.
Otherwise, this requires external pre- and post-processing.
What are the pre/post-processing steps? Any resources out there?
Apart from that, if I look at Google Translate (see the red-marked text in the following link), is this the pre/post-processing step?
https://drive.google.com/file/d/1POkH3XE6yfCoNkJn4DyMRP6J11Ecyqjl/view?usp=sharing
Guillaume Klein
@guillaumekln

What are the pre/post-processing steps?

There are several ways to achieve this, and I don't know your current workflow for NMT, so it's difficult for me to answer.

M Saiful Bari
@sbmaruf
@guillaumekln Thank you for your answer. I will think about and work on your points. If I run into any specific problem, I will ask you again. Right now I'm exploring NMT: I wanted to see how to handle its different issues. This is a pedagogical exploration, so I don't have a specific workflow yet. My research goal is to do a kind of discourse exploration in NMT. I just started.
lagleki
@lagleki
Does OpenNMT-tf support the attention mechanism?
Jean Senellart
@jsenellart
yes of course
Sirogha
@Sirogha
Hello. If I try to translate some text with -replace_unk_tagged and -tok_{src,tgt}_case_feature, I get results with ⦅unk:xxxxx⦆ but only in lowercase. How can I find the case type for the unk?
sergei-from-vironit
@sergei-from-vironit
Hi. I started training after tokenizing with a SentencePiece BPE model, and got this error in epoch 4:
[02/08/19 09:43:54 INFO] Epoch 4 ; Iteration 98300/103388 ; Optim SGD LR 1.000000 ; Source tokens/s 8042 ; Perplexity 10.81
/root/torch/install/bin/luajit: bad argument #2 to '?' (out of range at /root/torch/pkg/torch/generic/Tensor.c:913)
stack traceback:
[C]: at 0x7f65146d1580
[C]: in function '__index'
./onmt/train/Trainer.lua:234: in function 'getBatchIdx'
./onmt/train/Trainer.lua:276: in function 'trainEpoch'
./onmt/train/Trainer.lua:484: in function 'train'
train.lua:333: in function 'main'
train.lua:338: in main chunk
[C]: in function 'dofile'
/root/torch/install/lib/luarocks/rocks/trepl/scm-1/bin/th:150: in main chunk
[C]: at 0x00405d50
What can I do about it?
anaskoara
@anaskoara
Most NMT models use byte pair encoding to handle unknown words; can anyone explain why it is useful?
@sergei-from-vironit most NMT models use byte pair encoding to handle unknown words; can you explain why it is useful?
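For what it's worth, the intuition is that an out-of-vocabulary word gets segmented into known subword units instead of collapsing to <unk>. A toy greedy segmentation (illustrative only; real BPE learns its merge operations from corpus statistics):

```python
def bpe_segment(word, vocab):
    """Greedy longest-match split of `word` into subwords from `vocab`."""
    pieces, i = [], 0
    while i < len(word):
        for j in range(len(word), i, -1):
            if word[i:j] in vocab:
                pieces.append(word[i:j])
                i = j
                break
        else:
            pieces.append("<unk>")  # no subword covers this character
            i += 1
    return pieces

vocab = {"un", "know", "n", "word", "s"}    # hypothetical subword vocabulary
print(bpe_segment("unknownwords", vocab))   # ['un', 'know', 'n', 'word', 's']
```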
sahinurlaskar
@sahinurlaskar
Why are features calculated after splitting on the pipe? What happens if the data does not contain "|"?
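For context, OpenNMT-style word features ride along on each token, joined by a separator character; a token without the separator simply carries no extra features. A small parsing sketch (the separator character here is an assumption):

```python
def split_features(token, sep="|"):
    """Split 'word|feat1|feat2' into the word and its feature list.
    A token with no separator yields an empty feature list."""
    word, *feats = token.split(sep)
    return word, feats

print(split_features("Rahul|C"))   # ('Rahul', ['C'])
print(split_features("hello"))     # ('hello', [])
```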
geekygirl123
@geekygirl123
I have multiple questions regarding the copy mechanism implemented in the OpenNMT toolkit.
1) "Since we are using copy-attention [1] in the model, we need to preprocess the dataset such that source and target are aligned and use the same dictionary." This is a line from the README. What does alignment between source and target words mean?
2) Has anyone tried the copy mechanism with the Transformer architecture? If yes, how was the performance?
Guillaume Klein
@guillaumekln
You probably want to post these questions in https://gitter.im/OpenNMT/OpenNMT-py or on the forum: https://forum.opennmt.net/
woo chin
@chinwoo987_gitlab
I wonder if I should try a different BPE model other than the aggressive one? I will try to post updates more often at https://gitter.im/OpenNMT/OpenNMT-py
Karen Lastmann Assaraf
@karen.lastmannassaraf_gitlab
Hi!
Has anyone already converted an OpenNMT self-attention encoder-decoder to a Hugging Face Transformer?
Thanks!