    Rishab Goel
    @RishabGoel
    I think a project might be a good way to truly understand! Is someone interested in doing one with me?
    Thomas Mundt
    @tmundt
    Sounds good.
    @vyraun Did you solve the problem with the big data? This example seems too huge to be processed by normal means. Isn't there another one that isn't as big?
    Vikas Raunak
    @vyraun
    @tmundt I haven't tried again after upgrading my VM. But I guess other NMT tutorials must be out there too.
    Ancas Horia
    @ancashoria
    hey, I have a problem when fitting my model with tflearn. It says: IndexError: list index out of range
    What shape should X and Y have for this to work?
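    A minimal sketch of the shapes tflearn's DNN.fit typically expects, assuming a plain feed-forward classifier; the layer sizes and random data below are made up for illustration:

    import numpy as np
    import tflearn

    n_samples, n_features, n_classes = 100, 8, 3
    X = np.random.rand(n_samples, n_features)                           # X: [n_samples, n_features]
    Y = np.eye(n_classes)[np.random.randint(0, n_classes, n_samples)]   # Y: one-hot, [n_samples, n_classes]

    net = tflearn.input_data(shape=[None, n_features])                  # None is the batch dimension
    net = tflearn.fully_connected(net, 16, activation='relu')
    net = tflearn.fully_connected(net, n_classes, activation='softmax')
    net = tflearn.regression(net)

    model = tflearn.DNN(net)
    model.fit(X, Y, n_epoch=1, batch_size=16)
    # An IndexError during fit often means X or Y has the wrong rank
    # relative to input_data's shape or the final layer's size.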
    Rohith Radhakrishnan
    @radrohit
    Hi guys, do any of you have a pretrained model (convolutional network)? I am trying to visualize the hidden layers.
    Danijar Hafner
    @danijar
    I don't, but you can easily run the code from one of the official TF tutorials on conv nets.
    chotasanjiv
    @chotasanjiv
    Hi guys,
    do you know of any dataset that contains flower images?
    I have created a custom set using Google Images, but I get the following error when I run "tensorflow/tensorflow/examples/image_retraining/retrain.py": CRITICAL:tensorflow:Label inception has no images in the category training.
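    That error usually means retrain.py found no usable images for one of the labels ("inception" here, possibly because the downloaded model directory ended up inside the image folder). It expects the image directory to contain one sub-folder per label, each holding enough JPEGs to populate the training/validation/test split; a hypothetical layout (names made up):

    flower_photos/
        daisy/   image_0001.jpg ...
        roses/   image_0101.jpg ...
        tulips/  image_0201.jpg ...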
    kduy
    @kduy
    @chotasanjiv : it's here
    James
    @JamesDaniel
    Hey, does anyone know any good rooms to learn the math behind neural networks? Thanks in advance
    gumnn
    @arita37
    Onuray Sahin
    @onurays

    Hey, is anyone working on the im2txt model? What was the final training loss at the end of the training process (for 1M epochs)?

    For me it is at iteration 61763 and the training loss is 2.82. It was also around 2.9x at the 35000th iteration. I would love to hear yours to see the difference.

    James
    @JamesDaniel
    Thanks @arita37
    Richard L. Burton III
    @rburton
    re
    Leonard Lee
    @leonardgithub
    Hi everyone, I want to know how to classify my own images using TensorFlow. Any idea?
    Xiaolin Zhang
    @leoncamel
    Hi, TF hackers. Is there any method by which I can convert DT_FLOAT to DT_HALF/DT_BFLOAT16 in my graph? I am thinking of accelerating performance for the forward path only.
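    Not a whole-graph rewrite, but for individual tensors a cast is usually enough to get a reduced-precision forward path; a minimal sketch, assuming graph-mode TF and made-up tensor names and shapes:

    import tensorflow as tf

    x = tf.placeholder(tf.float32, shape=[None, 1024])   # DT_FLOAT activations (hypothetical shape)
    x_half = tf.cast(x, tf.float16)                       # DT_HALF
    x_bf16 = tf.cast(x, tf.bfloat16)                      # DT_BFLOAT16
    # Ops fed from the cast tensors run their forward pass in reduced precision,
    # but rewriting an already-built graph's dtypes wholesale takes more than a cast.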
    Pommier
    @jeanpat
    Hello, I've proposed a project for chromosome segmentation: https://github.com/jeanpat/DeepFISH. So far there are datasets (82146 images + ground-truth labels) and Python notebooks (a scikit-image-based way to generate the datasets). The notebooks could be better: the size of the datasets is limited by the amount of RAM. The datasets are waiting for a classifier to be trained on them.
    Xiaolin Zhang
    @leoncamel
    @jeanpat what kind of GPU card did you use?
    Pommier
    @jeanpat

    A small NVIDIA GPU: lspci -k | grep -A 2 -E "(VGA|3D)"
    00:02.0 VGA compatible controller: Intel Corporation 4th Gen Core Processor Integrated Graphics Controller (rev 06)
    Subsystem: CLEVO/KAPOK Computer Device 5000

        Kernel driver in use: i915

    --
    01:00.0 3D controller: NVIDIA Corporation GK208M [GeForce GT 740M] (rev ff)
    Kernel modules: nouveau, nvidia_drm, nvidia
    03:00.0 Network controller: Intel Corporation Wireless 7260 (rev 73)

    @leoncamel up to now, I don't know if I can train a cnn with the proposed data.
    Jeff James
    @jrjames83
    @onurays - I am training as we speak, at global step 275k
    I'm hoping at step 1,000,000 it will do reasonably good captioning - currently running at 1.6 global steps per second on an ec2 p2.xlarge single GPU machine (90 cents / hr on demand)
    Jeff James
    @jrjames83
    Did yours finish? Let me know how it's working
    Winter
    @malidong
    Hi, TensorFlow user from Japan here.
    Shikhar Srivastava
    @soilad
    Hello !
    I am facing an issue with Deepdream using an inception-v3 net that was retrained on my own image dataset.
    The graph for the retrained inception-v3 net is quite different, and consequently I'm finding it quite difficult to adapt the Deepdream notebook (https://github.com/tensorflow/tensorflow/blob/master/tensorflow/examples/tutorials/deepdream/deepdream.ipynb).
    Is it possible to use the retrained model for Deepdream in the first place? :O
    And if so, could you suggest some changes to the Deepdream notebook?
    Any help would be appreciated !
    LIN
    @magixsource
    Hi, I'm a TensorFlow newbie.
    Paweł Biernat
    @pwl
    Hi, I'm just starting with TensorFlow and I have a question. Say I have a vector x=[1,2,3] with shape [3], how do I produce a matrix whose columns come from x, something like [x,x,x,x], with shape [3,4]? Mathematically it could be described as the outer product of [1,2,3] and [1,1,1,1].
    Paweł Biernat
    @pwl
    If that helps, I'm trying to run Edward with w = Normal(mu=tf.zeros([D, K]), sigma=sigma0 * tf.ones([D, K])), and I would like to change tf.ones([D,K]) into the outer product matrix that I described.
    Paweł Biernat
    @pwl

    answering my own question

    x1, y1 = tf.meshgrid(tf.constant([1,2,3,4]), tf.ones(2), indexing='ij')

    seems to do the job.
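    The same [3, 4] matrix can also be built without meshgrid; a small sketch using the shapes from the question:

    import tensorflow as tf

    x = tf.constant([1., 2., 3.])        # shape [3]
    col = tf.expand_dims(x, 1)           # shape [3, 1]
    tiled = tf.tile(col, [1, 4])         # shape [3, 4], every column equal to x
    outer = col * tf.ones([1, 4])        # same result via broadcasting (outer product with a ones vector)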

    James
    @JamesDaniel
    hey, anybody know why I can't install tensorflow in ubuntu?
    (virt4) user@ubuntu:~/virt_env/virt4$ pip install --upgrade https://storage.googleapis.com/tensorflow/linux/cpu/tensorflow-0.11.0-cp27-none-linux_x86_64.whl
    Collecting tensorflow==0.11.0 from https://storage.googleapis.com/tensorflow/linux/cpu/tensorflow-0.11.0-cp27-none-linux_x86_64.whl
    Downloading https://storage.googleapis.com/tensorflow/linux/cpu/tensorflow-0.11.0-cp27-none-linux_x86_64.whl (39.8MB)
    99% |████████████████████████████████| 39.8MB 40.4MB/s eta 0:00:01Exception:
    Traceback (most recent call last):
    File "/home/user/virt_env/virt4/local/lib/python2.7/site-packages/pip/basecommand.py", line 215, in main
    status = self.run(options, args)
    File "/home/user/virt_env/virt4/local/lib/python2.7/site-packages/pip/commands/install.py", line 335, in run
    wb.build(autobuilding=True)
    File "/home/user/virt_env/virt4/local/lib/python2.7/site-packages/pip/wheel.py", line 749, in build
    self.requirement_set.prepare_files(self.finder)
    File "/home/user/virt_env/virt4/local/lib/python2.7/site-packages/pip/req/req_set.py", line 380, in prepare_files
    ignore_dependencies=self.ignore_dependencies))
    File "/home/user/virt_env/virt4/local/lib/python2.7/site-packages/pip/req/req_set.py", line 620, in _prepare_file
    session=self.session, hashes=hashes)
    File "/home/user/virt_env/virt4/local/lib/python2.7/site-packages/pip/download.py", line 821, in unpack_url
    hashes=hashes
    File "/home/user/virt_env/virt4/local/lib/python2.7/site-packages/pip/download.py", line 659, in unpack_http_url
    hashes)
    File "/home/user/virt_env/virt4/local/lib/python2.7/site-packages/pip/download.py", line 882, in _download_http_url
    _download_url(resp, link, content_file, hashes)
    File "/home/user/virt_env/virt4/local/lib/python2.7/site-packages/pip/download.py", line 605, in _download_url
    consume(downloaded_chunks)
    File "/home/user/virt_env/virt4/local/lib/python2.7/site-packages/pip/utils/init.py", line 852, in consume
    deque(iterator, maxlen=0)
    File "/home/user/virt_env/virt4/local/lib/python2.7/site-packages/pip/download.py", line 571, in written_chunks
    for chunk in chunks:
    File "/home/user/virt_env/virt4/local/lib/python2.7/site-packages/pip/utils/ui.py", line 139, in iter
    for x in it:
    File "/home/user/virt_env/virt4/local/lib/python2.7/site-packages/pip/download.py", line 560, in resp_read
    decode_content=False):
    File "/home/user/virt_env/virt4/local/lib/python2.7/site-packages/pip/_vendor/requests/packages/urllib3/response.py", line 357, in stream
    data = self.read(amt=amt, decode_content=decode_content)
    File "/home/user/virt_env/virt4/local/lib/python2.7/site-packages/pip/_vendor/requests/packages/urllib3/response.py", line 314, in read
    data = self._fp.read(amt)
    File "/home/user/virt_env/virt4/local/lib/python2.7/site-packages/pip/_vendor/cachecontrol/filewrapper.py", line 63, in read
    self._close()
    File "/home/user/virt_env/virt4/local/lib/python2.7/site-packages/pip/_vendor/cachecontrol/filewrapper.py", line 50, in _close
    self.callback(self.buf.getvalue())
    File "/home/user/virt_env/virt4/local/lib/python2.7/site-packages/pip/_vendor/cachecontrol/controller.py", line 275, in cache_response
    self.serializer.dumps(request, response, body=body),
    File "/home/user/virt_env/virt4/local/lib/python2.7/site-packages/pip/_vendor/cachecontrol/serialize.py", line 87, in dumps
    ).encode("utf8"),
    MemoryError
    ahh, sorry. I thought that would be collapsed
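    If the MemoryError comes from pip trying to buffer the 39.8 MB wheel in its cache (which is what the cachecontrol frames in that traceback suggest), skipping the download cache is a common workaround worth trying:

    pip install --no-cache-dir --upgrade https://storage.googleapis.com/tensorflow/linux/cpu/tensorflow-0.11.0-cp27-none-linux_x86_64.whl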
    James
    @JamesDaniel
    so I got as far as installing tensorflow globally even though I had the virtualenv active :/
    problem was I was using the latest Ubuntu 16 but should have been using Ubuntu 14
    64-bit system of course
    but I'm having another problem :/
    numpy won't install and it's asking for setuptools, which is already installed
    Cleaning up...
    Command /usr/bin/python -c "import setuptools, tokenize;__file__='/tmp/pip_build_root/numpy/setup.py';exec(compile(getattr(tokenize, 'open', open)(__file__).read().replace('\r\n', '\n'), __file__, 'exec'))" install --record /tmp/pip-5hYSAi-record/install-record.txt --single-version-externally-managed --compile failed with error code 1 in /tmp/pip_build_root/numpy
    Traceback (most recent call last):
      File "/usr/bin/pip", line 9, in <module>
        load_entry_point('pip==1.5.4', 'console_scripts', 'pip')()
      File "/usr/lib/python2.7/dist-packages/pip/__init__.py", line 235, in main
        return command.main(cmd_args)
      File "/usr/lib/python2.7/dist-packages/pip/basecommand.py", line 161, in main
        text = '\n'.join(complete_log)
    UnicodeDecodeError: 'ascii' codec can't decode byte 0xe2 in position 72: ordinal not in range(128)
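    One possible first step, assuming the very old system pip (1.5.4 in that traceback) is what's choking: upgrade pip and setuptools inside the active virtualenv before retrying numpy:

    pip install --upgrade pip setuptools
    pip install numpy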
    YMOS
    @YMOS
    Hi guys
    I am in need of a few ideas that can be implemented in deep learning using TensorFlow. Any help will be appreciated