    Matteo Interlandi
    @interesaaat
    yep
    Natan Katz
    @natank1
    I will (in a while). Regarding GPU, who can do it? IronPython? Something with ONNX? Which one?
    Matteo Interlandi
    @interesaaat
    I am looking into it. If you want, you can do it yourself by downloading the GPU version of libtorch from the PyTorch website and swapping it into the native redist directory.
    Natan Katz
    @natank1
    Still does not work (with CPU). I saved it with a regular save, and the code does nothing but var module = JIT.Module.Load(@"C:\Natan\anaconda3_reg_best.pt");
    Sorry, var module = JIT.Module.Load(@"C:\Natan\anaconda3\_reg_best.pt");
    Natan Katz
    @natank1
    Maybe I have to save the model differently?
    Natan Katz
    @natank1
    What do I do wrong?
    Matteo Interlandi
    @interesaaat
    Is the model a TorchScript model? If it is a pickle model, I am not sure it will work.
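
For reference, a minimal sketch of the round trip being discussed, assuming the model was exported on the Python side with torch.jit.trace or torch.jit.script; the file path, input shape, and tensor factory are placeholders, and exact TorchSharp signatures may differ between versions.

```csharp
// Sketch only: load a TorchScript export (torch.jit.trace / torch.jit.script),
// not a pickled state_dict. Path, shape, and tensor factory are placeholders/assumptions.
using TorchSharp;
using TorchSharp.Tensor;

var module = JIT.Module.Load(@"C:\models\reg_best_scripted.pt");
var input = FloatTensor.Ones(new long[] { 1, 8 });   // match your model's expected input shape
var output = module.Forward(input);
System.Console.WriteLine(output);
```
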
    YanYas
    @YanYas
    Hi @interesaaat, it may be a little early to ask, but do you think the ML.NET DataLoaders will become a core part of the TorchSharp package or be a separate redistributable? I ask because, strange as it may seem, I would rather ML.NET were an option than a requirement for TorchSharp.
    As a CNTK refugee, I accept that there is definitely a need for data loading, and CNTK's Python side made it possible to create custom deserializers. In my little weak brain, some sort of custom data-loader-cum-batch-snatcher interface for TorchSharp would give everyone all the good things, and that should include ML.NET DataLoaders and whatever data types (multi-channel audio samples, point-cloud frames, etc.) the future has in store.
    (Big ask though)
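
Purely as an illustration of the kind of pluggable loader being described here (nothing like this exists in TorchSharp or ML.NET; the interface name and shape are hypothetical):

```csharp
// Hypothetical sketch only -- not an existing TorchSharp or ML.NET type.
// The idea: any source (an ML.NET IDataView, audio frames, point clouds, ...)
// that can yield (input, target) tensor batches could feed a training loop.
using System.Collections.Generic;
using TorchSharp.Tensor;

public interface IBatchSource
{
    // Yield successive mini-batches as (input, target) pairs.
    IEnumerable<(TorchTensor Input, TorchTensor Target)> GetBatches(int batchSize);
}
```
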
    Matteo Interlandi
    @interesaaat
    Hi @YanYas, what I had in mind is to have the ML.NET loaders as NuGets, not strictly required. For the moment we have loaders for MNIST and CIFAR, and I believe those will stay there. Nevertheless, I don't want to get into the business of reinventing the data-loader wheel, which is why I am pushing for the ML.NET integration.
    YanYas
    @YanYas
    Thanks @interesaaat, sounds good, and nice progress so far
    Michaela Klauck
    @MKlauck

    Hi,
    when using TorchSharp, we encountered a problem that looks like memory corruption (a TorchTensor appears to change its values to junk). At least one instance seems to have been caused by initializing a tensor with an array which later went out of scope in C# and was freed by the GC.
    Is this a bug, or is the user responsible for keeping the source array in scope?

    Making sure that the array is never freed by the garbage collector solved the problem at first glance. But later we encountered similar behavior in a case where none of the data could have been freed unintentionally; nevertheless, at some point our tensor data is overwritten. Is there anything else we should take care of when using TorchSharp to avoid memory corruption? Has anybody encountered similar problems?

    Thank you very much for your help!

    Best,
    Michaela

    Matteo Interlandi
    @interesaaat
    Hi @MKlauck! Glad to hear you are using TorchSharp, and sorry for the delayed response. To answer your questions: if you create the tensors from C# arrays, then yes, you have to manage the disposal of the tensors yourself. Also, if you overwrite the data, the tensor's data will be overwritten too, unless you create a copy. (Note that you don't have to do this if you instead create the tensors using torch methods like zeros, etc.) Regarding the overwriting problem: this might be either a bug, or you are modifying the original data array. Were you able to reproduce it? If you think it is a bug and you can reproduce it, could you please file an issue on GitHub? Thanks!
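
To illustrate the distinction drawn above, a minimal sketch; the factory and copy method names (From, Zeros, Clone) are assumptions and may differ in your TorchSharp version.

```csharp
// Sketch only: a tensor built from a C# array may share that array's memory,
// so the array must stay alive and unmodified for the tensor's lifetime.
// Factory/method names below are assumptions.
using TorchSharp.Tensor;

var data = new float[] { 1f, 2f, 3f, 4f };
var wrapped = FloatTensor.From(data, new long[] { 2, 2 });  // may alias `data`

// Mutating `data` (or letting it be collected) can corrupt `wrapped`;
// taking a copy decouples the tensor from the source array.
var copy = wrapped.Clone();

// Tensors created by torch-style factories own their storage and are unaffected.
var zeros = FloatTensor.Zeros(new long[] { 2, 2 });
```
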
    Michaela Klauck
    @MKlauck
    Hi @interesaaat,
    Thank you very much for your response. Yes, this describes exactly what I observed. But even if I make sure that the arrays will not be overwritten, at some point the data is lost. I created a minimal working example which I will post as an issue on GitHub as soon as the deadline for my project is over and I have a little bit more time. For the moment, I fixed the problem by implementing the NN functionality I need for the project myself.
    YanYas
    @YanYas
    Hi, just wondering if there are any upcoming updates to this project. It's been quiet for a while.
    Matteo Interlandi
    @interesaaat
    Hey @YanYas, I have some updates I will try to push when I have some time. I bumped TorchSharp to PyTorch 1.3 and fixed a few memory bugs. I need time to test the code; I will push the update afterwards.
    YanYas
    @YanYas
    Okay, thanks for letting me know, and good luck!
    pappataci
    @pappataci

    Hi all. Sorry for the trivial question, but I am trying to run Torch.NET from F# Interactive (Visual Studio 2019) and have failed. I get the following exception: Unable to load DLL 'python37': The specified module could not be found. I have added the folder where the compiler is to the environment variables and have installed Conda (possibly useless). What else am I missing? Of course I have added references to Microsoft.CSharp.dll, Numpy.Bare.dll, Python.Runtime.dll, and Torch.Net.dll.
    Is there a step-by-step guide on how to start with Torch.NET?

    Thanks a lot for any help.

    Matteo Interlandi
    @interesaaat
    Hi @pappataci, this is TorchSharp, not Torch.NET! We don't use any Python code here. Please check with the Torch.NET folks: https://github.com/SciSharp/Torch.NET
    pappataci
    @pappataci

    @interesaaat Thanks for your response. Sorry, I confused the two projects. I would prefer a managed library, so I am getting into TorchSharp. It seems functional so far from a console application; I am having trouble, though, running it from F# Interactive. Specifically, I get this exception:
    System.DllNotFoundException: Unable to load DLL 'LibTorchSharp': The specified module could not be found. (Exception from HRESULT: 0x8007007E)
    at TorchSharp.Torch.THSTorch_seed(Int64 seed)
    at <StartupCode$FSI_0018>.$FSI_0018.main@() in

    Has anyone had any luck with TorchSharp through F# Interactive? Thanks

    Kevin Malenfant
    @kevmal
    You need to add the path of LibTorchSharp to PATH or LD_LIBRARY_PATH. Use System.Environment.SetEnvironmentVariable and System.Environment.GetEnvironmentVariable to set it up before using TorchSharp.
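
Concretely, something along these lines before the first TorchSharp call (the directory is a placeholder; the same System.Environment calls work from F# Interactive):

```csharp
// Sketch: make the native LibTorchSharp/libtorch binaries discoverable before
// TorchSharp is first used. The directory below is a placeholder.
using System;

var nativeDir = @"C:\libs\TorchSharp\runtimes\win-x64\native";
var current = Environment.GetEnvironmentVariable("PATH") ?? "";
Environment.SetEnvironmentVariable("PATH", nativeDir + ";" + current);
// On Linux, do the analogous thing with LD_LIBRARY_PATH, as suggested above.
```
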
    pappataci
    @pappataci
    @kevmal Worked; thanks a lot
    pappataci
    @pappataci

    What is the equivalent of torch.optim? Has it been implemented?

    Any recommendations on how to reproduce PyTorch code with TorchSharp in .NET? Thanks again.

    Matteo Interlandi
    @interesaaat
    There are a couple of optimizers implemented, I believe SGD and Adam (sorry, I am away from my laptop). Porting code should be quite straightforward; our API is basically the PyTorch API modulo C# idiosyncrasies.
    pappataci
    @pappataci
    @interesaaat Thanks a lot for your prompt response: I will look into that. Cheers.
    pappataci
    @pappataci
    Yes, you were right; TorchSharp.NN.Optimizer contains SGD and Adam
    solarflarefx
    @solarflarefx

    I am interested in passing data already on the GPU into a DL network, with the output still residing on the GPU to be passed to the next element in a pipeline. If possible, I'd like to do this without having to perform CPU/GPU copies. Is this possible at the moment? In the end it's a bit tricky, as it would require a conversion from a CUDA data type to a torch tensor.

    I think this concept can work in Python using PyCuda and PyTorch: https://discuss.pytorch.org/t/interop-between-pycuda-and-pytorch/588

    However, I am not sure if this would currently work in C# using TorchSharp.

    Kevin Malenfant
    @kevmal
    @solarflarefx Could you allocate in the DL framework, do your existing CUDA work, and then come back to the DL framework?
    pappataci
    @pappataci
    Is there any tutorial on how to train an NN? Also on the GPU? I would like to implement a reinforcement learning algorithm using TorchSharp in F#: is it doable with the current code? If yes, I will try to be more specific. Thanks
    Matteo Interlandi
    @interesaaat
    @pappataci there are no tutorials, but we have AlexNet and MNIST examples here. So you could take a look at this and this tutorial, for example, and use TorchSharp instead of PyTorch. Regarding the GPU, the code is there, but unfortunately I haven't had the time to test it yet.
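
In the absence of a tutorial, here is a skeletal training loop in the spirit of those examples; every factory and method name below is an assumption based on the "PyTorch API in C#" description in this chat, so treat the repo's MNIST/AlexNet examples as the authoritative reference.

```csharp
// Skeletal sketch only: a tiny regression training loop in the PyTorch style.
// Factory names (NN.Module.Linear, FloatTensor.RandomN, NN.LossFunction.MSE, ...)
// are assumptions and may not match the actual TorchSharp API.
using TorchSharp;
using TorchSharp.Tensor;

var model = NN.Module.Linear(4, 1);                              // assumed module factory
var optimizer = NN.Optimizer.SGD(model.GetParameters(), 0.01);   // SGD/Adam exist per this chat

var inputs = FloatTensor.RandomN(new long[] { 64, 4 });          // toy data, assumed factory
var targets = FloatTensor.RandomN(new long[] { 64, 1 });

for (var epoch = 0; epoch < 100; epoch++)
{
    optimizer.ZeroGrad();                                        // clear accumulated gradients
    var prediction = model.Forward(inputs);
    var loss = NN.LossFunction.MSE(prediction, targets, NN.Reduction.Mean); // assumed loss helper
    loss.Backward();                                             // backpropagate
    optimizer.Step();                                            // update parameters
}
```
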
    pappataci
    @pappataci
    @interesaaat Thanks. I will look into that.
    solarflarefx
    @solarflarefx

    Does TorchSharp have access to the following C++ function? https://pytorch.org/cppdocs/api/function_namespacetorch_1ad7fb2a7759ef8c9443b489ddde494787.html

    I think this is ultimately what I am looking for: a way to take data already on the GPU and use it as a tensor input for a DL model.

    Don Syme
    @dsyme

    @interesaaat Did you ever push these fixes?

    Hey @YanYas, I have some updates I will try to push when I have some time. I bumped TorchSharp to PyTorch 1.3 and fixed a few memory bugs. I need time to test the code; I will push the update afterwards.

    Matteo Interlandi
    @interesaaat
    No, sorry @dsyme, I haven't pushed them yet. If you are playing with TorchSharp and need the fixes, I will try to resume the work and push them soon-ish.
    Don Syme
    @dsyme

    Hi @interesaaat,

    Yes, I'm playing with TorchSharp - my need is for a libtorch backend for DiffSharp (so no AD engine, just the ATen bindings really).

    (If there's any other project giving these bindings I could use that instead)
    Looking at https://github.com/xamarin/TorchSharp/network it seems like @moloneymb has a relevant fix as well.
    Matteo Interlandi
    @interesaaat
    I am not aware of anything similar, but I haven't looked at the space lately. Ok, let me try to find some time in the coming weeks to push the fix.
    Don Syme
    @dsyme
    I'm doing some work to get TorchSharp up to the point that we can use it as a backend for DiffSharp, please see xamarin/TorchSharp#127
    lang shuang
    @xiaoshuangzi
    Hi, I see the Git repo has been updated and lots of things have changed. When using the latest repo, I couldn't call the Forward function after loading a model trained by PyTorch. What should I do? Thanks very much!
    lang shuang
    @xiaoshuangzi
    I wonder why torch.Jit was commented out. Does this mean torch.Jit is not implemented yet, or has it been replaced by another function? Thanks a lot!
    Miguel de Icaza
    @migueldeicaza
    Don Syme might know
    ^^
    Don Syme
    @dsyme
    Could you add an issue in the repo? Thanks.

    When using the latest repo, I couldn't call the Forward function after loading a model trained by PyTorch. What should I do?

    Hmmm this should work. Loading should be more accurate now. Please do add an issue.

    I wonder why torch.Jit was commented out. Does this mean torch.Jit is not implemented yet, or has it been replaced by another function? Thanks a lot!

    There was a lot of change in the C++ API for torch.JIT

    We are looking at code generating more of the API.

    Please do add an issue in the repo about this.

    Matthew Moloney
    @moloneymb
    Hey Don, I'm not sure how much of the JIT API will get picked up by the code generation. The JIT stuff will probably be done by hand.