    Soumith Chintala
    @soumith
    whenever variable X comes onto the stack, it prints its value
    just guessing here...
    Alfredo Canziani
    @Atcold
    OK, so I've found out how to use the watch (from the RemDebug Example of Execution). Still, I have to dig more into
    reload                -- restarts the current debugging session
    stack                 -- reports stack trace
    output stdout <d|c|r> -- capture and redirect io stream (default|copy|redirect)
    I've opened an issue, pkulchenko/MobDebug#15, so perhaps I'll get some info from the developer :grimacing:
    Siavash Sakhavi
    @ssakhavi
    @Atcold Thanks for the topic, Alfredo. But I have a question: debugging is important, but why not use ZeroBrane instead of going towards manual debugging?
    I'm guessing that a developer must have a good reason for doing so.
    For myself, I got ZeroBrane to work just today (after multiple failed attempts at getting it to work), and I'm satisfied with it.
    Alfredo Canziani
    @Atcold
    @ssakhavi, that's because I debug code on remote machines. No GUI in my workflow. And the debugger gets installed simply by typing luarocks install mobdebug, and boom, everything works just fine from the command line.
    What do you mean by manual debugging? I debug my code "manually" as well...
    Siavash Sakhavi
    @ssakhavi
    I meant using a GUI.
    I know where you're coming from. I also tried mobdebug and worked with it a bit. I didn't find any superiority over ZeroBrane. It would be nice if you could show an application that ZeroBrane can't handle.
    Alfredo Canziani
    @Atcold
    Ehm… @ssakhavi, I don’t use the GUI because I ssh into several servers.
    I tried the GUI as well, some time ago, and it was crashing quite often.
    I agree that ZeroBrane is “superior” in terms of usability, but MobDebug is way smaller and faster to set up.
    So, now I’m getting familiar with MobDebug, and I will write my notes in the repo for others to use. This is how the articles come to life: they are notes of useful stuff I don’t want to forget.
    I didn’t find ZeroBrane that indispensable. MobDebug is becoming a good friend, instead.
    @soumith, I’ve never written an nn module myself, so I don’t have notes about it. Any suggestions about “things” one should take into account?
    Soumith Chintala
    @soumith
    @Atcold other than the standard constructor, updateOutput, updateGradInput, and accGradParameters (if the module has parameters), one should be made to understand the following (see the sketch after this list):
    • every module comes with a self.output and a self.gradInput. Mostly they are tensors, but sometimes they can be tables as well (for example nn.ParallelTable). When they are tensors, they are initially of size 0; they have to be resized appropriately when used.
    • it should be discussed how :type() works and what it means.
    • the common and useful pattern of input.new() or input.nn.SpatialConvolution_updateOutput(...), where input.xxx is used as a method dispatcher according to the type of the incoming tensor, should be explained.
    • Also, a user has to be made aware that the first input's size is not the only tensor size one will see. A module can receive a 12x10x10 input in one :forward/:backward pass, and then in the next iteration be sent a 15x5x5 one.
    • And lastly, in terms of performance, a user should be made aware that creating reusable buffers and keeping them in self (like self.buffer) is important, rather than allocating new memory every time.
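    A minimal sketch to make these points concrete (nn.MyScale is a made-up module that just multiplies its input by a constant; it only illustrates the output/gradInput resizing and the type-dispatch idiom above, not a reference implementation):
    require 'nn'

    -- hypothetical nn.MyScale: scales its input by a constant factor
    local MyScale, parent = torch.class('nn.MyScale', 'nn.Module')

    function MyScale:__init(scale)
       parent.__init(self)   -- gives us self.output and self.gradInput as 0-size tensors
       self.scale = scale or 1
    end

    function MyScale:updateOutput(input)
       -- the input size can change between calls, so resize self.output every time;
       -- if a scratch buffer were needed, the idiom would be
       -- self.buffer = self.buffer or input.new(), which allocates a tensor of the
       -- same type as the incoming input
       self.output:resizeAs(input):copy(input):mul(self.scale)
       return self.output
    end

    function MyScale:updateGradInput(input, gradOutput)
       self.gradInput:resizeAs(gradOutput):copy(gradOutput):mul(self.scale)
       return self.gradInput
    end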
    Alfredo Canziani
    @Atcold
    Wow, @soumith, I have plenty of stuff to study here before I have a decent enough grasp to write something meaningful. I'm thinking about what I might use as an example that could showcase all the points you mentioned… (Moreover, I guess I should explain the whole "why" of embedding C stuff with the method dispatcher…)
    Soumith Chintala
    @soumith
    @Atcold you can use nn.Threshold, which is super simple and goes down to C, and nn.Linear, which is pure Lua.
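    As a rough usage sketch, both can be exercised in a few lines to compare the two styles:
    require 'nn'

    -- nn.Threshold: thin Lua wrapper whose actual work happens in C
    local t = nn.Threshold(0, 0)       -- threshold at 0, replacement value 0
    print(t:forward(torch.randn(4)))

    -- nn.Linear: implemented in Lua on top of tensor operations
    local l = nn.Linear(4, 2)          -- 4 inputs, 2 outputs
    print(l:forward(torch.randn(4)))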
    Alfredo Canziani
    @Atcold
    @soumith, OK, I’ll look into it after the debugger chapter. Thank you.
    Siavash Sakhavi
    @ssakhavi
    I'm guessing you should consider several kinds of modules: modules that are affected by backprop, modules that act only as a function, modules that behave differently in different modes, and modules that can be GPU accelerated. Each of these can also be written in Lua or C.
    Alfredo Canziani
    @Atcold
    @ssakhavi, you're right! You could set up a tutorial skeleton, and then we could help you fill it in!
    Feel free to open a pull request, and start pushing commits!
    Alfredo Canziani
    @Atcold
    @ssakhavi, have a look and see if you can follow along with the new tutorial. You can give me feedback here (if you spot typos, it's just easier to let me fix them before I push to master; no need for a PR).
    Siavash Sakhavi
    @ssakhavi
    Sure
    Siavash Sakhavi
    @ssakhavi
    @Atcold, one thing missing from the tutorial: you forgot to mention that one needs to open another terminal to "listen" to the code during the debug process.
    Siavash Sakhavi
    @ssakhavi
    So what should be done is:
    • Open a Torch REPL using th and execute require('mobdebug').listen()
    • Run the desired code (here test.lua) in another terminal using th test.lua
    • The debugger then starts; type help to see the available functionality (a sketch of test.lua follows below).
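    For reference, a sketch of what test.lua itself could contain (the file name and body are just placeholders; the start() call is what connects the script to the listener, on port 8172 by default):
    -- test.lua (placeholder content)
    require('mobdebug').start()   -- connect to the debugger listening on localhost:8172

    local x = torch.randn(3, 3)   -- from here on, execution can be paused and stepped
    print(x:sum())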
    Typing help is already suggested by the debugger, so it's redundant.
    But thank you, I'll mention the fact that you need to have the debugger up and running before issuing th test.lua.
    Siavash Sakhavi
    @ssakhavi
    Sorry, I didn't see the recent update.
    Thanks
    Alfredo Canziani
    @Atcold
    OK
    I actually have to take the time to finish this chapter!
    Siavash Sakhavi
    @ssakhavi
    I didn't get what you said
    Alfredo Canziani
    @Atcold
    Whereabouts?
    Siavash Sakhavi
    @ssakhavi
    "I have actually to take the time to finish this chapter!"
    Alfredo Canziani
    @Atcold
    That's not complete.
    And I need to commit the last part
    Siavash Sakhavi
    @ssakhavi
    OK
    Alfredo Canziani
    @Atcold
    Debugger session completed.
    Siavash Sakhavi
    @ssakhavi
    Excellent! Will look at it tomorrow
    Alfredo Canziani
    @Atcold
    Thank you @ssakhavi! I have to decide what to write next. One likely topic would be the "custom module" for nn
    Siavash Sakhavi
    @ssakhavi
    @Atcold I went through the tutorial. Everything went smoothly. Thanks! If you want you can assign something regarding the next tutorial to me so you don't carry all the load.
    Alfredo Canziani
    @Atcold
    Alright @ssakhavi. I'll do it later this week.
    Alfredo Canziani
    @Atcold
    Would it make sense to write a chapter about Torch's memory mapping capabilities?
    Alfredo Canziani
    @Atcold
    I have half a chapter already written (I posted it as a standalone article on the forum some time ago).
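    Roughly the kind of thing I mean (just a sketch with a placeholder file name, not the article itself): a Storage can be constructed directly on top of a file, so tensor operations read and write the mapped memory.
    require 'torch'

    -- sketch: map a file on disk as a FloatStorage; with shared = true, writes go
    -- back to the file and are visible to other processes mapping it
    local n = 1000
    local s = torch.FloatStorage('data.bin', true, n)        -- 'data.bin' is a placeholder
    local t = torch.FloatTensor(s, 1, torch.LongStorage{n})  -- view the storage as a tensor

    t:fill(1)   -- the mapped file now holds n float ones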
    Soumith Chintala
    @soumith
    I don't know much about Torch's memory mapping capabilities :) so I'm looking forward to what you have.
    wbqtac
    @wbqtac
    Hi, has anyone run into a cublas runtime error?
    I got a cublas runtime error, "library not initialized", thrown at nn.Linear line 66.
    Thank you in advance
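    (For reference only, not a diagnosis: a minimal GPU forward pass through nn.Linear that should run cleanly when the CUDA/cuBLAS setup itself is fine; the device index is just an example.)
    require 'cutorch'
    require 'cunn'

    cutorch.setDevice(1)   -- pick the GPU explicitly (device 1 is an example)

    local model = nn.Linear(10, 5):cuda()          -- move the module to the GPU
    local input = torch.CudaTensor(10):normal()    -- a random GPU input vector
    print(model:forward(input))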