/usr/local/share/lua/5.1/nn/Linear.lua:99: invalid arguments: CudaTensor number CudaTensor CudaTensor
expected arguments: *CudaTensor~2D* [CudaTensor~2D] [float] CudaTensor~2D CudaTensor~2D
                  | *CudaTensor~2D* float [CudaTensor~2D] float CudaTensor~2D CudaTensor~2D
stack traceback:
The error occurs in module:backward() and, tracing back, seems to imply that gradInput is of the wrong type. I'm using the stock callback function (mostly). Can anyone suggest how to track this down? I'm trying to avoid spending a day diving into the dirty bits of where dp, dpnn, and nn intersect.
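For context, a minimal sketch of the kind of type check I've been doing. The module and tensors here are placeholders, not my actual model: nn.Linear on CUDA expects every tensor reaching backward() to be a torch.CudaTensor, so printing types at that boundary should show which tensor was never converted.

```lua
-- Sketch only: verify every tensor reaching backward() is a CudaTensor.
-- 'model', 'input', and 'gradOutput' stand in for the real objects.
require 'nn'
require 'cunn'

local model = nn.Linear(10, 5):cuda()
local input = torch.CudaTensor(4, 10):uniform()

local output = model:forward(input)
local gradOutput = torch.CudaTensor(4, 5):uniform()

-- If any of these prints something other than 'torch.CudaTensor',
-- that tensor was never converted and will trigger the
-- "invalid arguments" error from Linear.lua.
print(torch.type(input), torch.type(output), torch.type(gradOutput))

local gradInput = model:backward(input, gradOutput)
print(torch.type(gradInput))
```

In my case the suspicion is that one of the tensors dp or dpnn hands to backward() is still a FloatTensor or DoubleTensor rather than a CudaTensor.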
I am trying to run the example recurrentlanguagemodel.lua using the Penn Treebank dataset.
When I train it with the cuda flag set to false, it runs perfectly fine. But when I try to run it with cuda enabled, it gets a segmentation fault.
Following is my log
Any debugging help would be great. How should I go about solving this?
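In case it helps to reproduce, this is roughly how I've been trying to get a native backtrace out of the segfault. The script name and cuda flag are from the example above; the gdb invocation is standard, but whether you launch via th or luajit depends on your install:

```shell
# Run the Torch script under gdb so the segfault yields a C-level backtrace.
# 'th' is the Torch launcher; substitute 'luajit' if you invoke the script directly.
gdb --args th recurrentlanguagemodel.lua --cuda
# then at the (gdb) prompt:
#   run
#   bt full
```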