    Milan Oljaca
    @moljacq

    Announcement

    ONNX Edge WG has been retired as of Jan 08, 2020.
    A big thank you to all contributors. All WG artifacts are captured at the edge GitHub location.
    Vinitra Swamy
    @vinitra

    Model Zoo + Tutorials SIG, Meeting #1

    Time: Jan 30, 2 - 2:30 PM PST
    https://zoom.us/j/818755381

    The Model Zoo + Tutorials ONNX SIG will be having its inaugural meeting next Thursday. Please join us if you are interested in discussing or contributing to related outreach efforts!

    Wei-Sheng Chin
    @wschin

    Loading an ONNX model in Python as a protocol buffer:

    one of the objects you get while loading an ONNX model is a GRAPH; the GRAPH contains "nodes", "initializer", "attribute", etc.

    My question is: do "nodes", "initializer", and "attribute" have any ordering with a logical connection to the neural network's execution, or is the ordering random?

    TIA

    yeuo

    nodes are sorted by their execution order. I believe the others are just stored in arbitrary order.
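    For anyone who wants to verify this, here is a minimal Python sketch (the "model.onnx" path is just a placeholder) that loads a model and walks graph.node in its stored order, along with the initializer list:

        import onnx

        model = onnx.load("model.onnx")  # placeholder path to an exported model
        graph = model.graph

        # graph.node is stored in topological (execution) order
        for i, node in enumerate(graph.node):
            print(i, node.op_type, list(node.input), "->", list(node.output))

        # graph.initializer holds the constant tensors (weights); their order carries no particular meaning
        for init in graph.initializer:
            print(init.name, list(init.dims))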

    newoneincntk
    @newoneincntk
    Anyone online?
    natan katz
    @natank1_gitlab
    Hello... I exported a torch model to ONNX (trained with GPU and exported on CPU). I cannot upload it in C#.
    Dušan Josipović
    @dulex123
    Should importing a model with onnxruntime be the same in Python and C++? I've managed to import it in Python but not in C++.
    natan katz
    @natank1_gitlab
    In C# you use onnxruntime from NuGet
    Dušan Josipović
    @dulex123
    @natank1_gitlab c++ sorry
    natan katz
    @natank1_gitlab
    I didn't get it. I'm talking about C#, though C++ solutions are good too.
    I managed to export in Python but not to upload it in C#
    Dušan Josipović
    @dulex123
    what does upload mean?
    load model in c#?
    natan katz
    @natank1_gitlab
    yes
    the onnx
    Dušan Josipović
    @dulex123
    don't know
    trying to do the same in c++
    first time experience sucks
    natan katz
    @natank1_gitlab
    I know that for exporting from Keras it does work
    natan katz
    @natank1_gitlab
    BTW, are there any ONNX forums?
    Prasanth Pulavarthi
    @prasanthpul
    @natank1_gitlab @dulex123 ONNX Runtime has Python, C#, and C++ APIs, and any ONNX model will work with any of the language APIs. You'll need to share more details about the error you are seeing - you can post the details as an issue at https://github.com/microsoft/onnxruntime
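    As a quick sanity check before moving to C# or C++, a minimal Python sketch along these lines (the model path and dummy input are placeholders, and the exact InferenceSession arguments vary a bit between onnxruntime versions) confirms whether the exported model itself loads and runs:

        import numpy as np
        import onnxruntime as ort

        # Newer onnxruntime versions want explicit providers; older ones accept just the model path.
        sess = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])

        inp = sess.get_inputs()[0]
        print(inp.name, inp.shape, inp.type)

        # Feed a dummy tensor; replace the shape/dtype with whatever your model expects.
        shape = [d if isinstance(d, int) else 1 for d in inp.shape]
        dummy = np.zeros(shape, dtype=np.float32)
        outputs = sess.run(None, {inp.name: dummy})
        print([o.shape for o in outputs])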
    natan katz
    @natank1_gitlab
    Thanks
    Ebey Abraham
    @MrGrayCode

    Hi... I ran into this error while exporting ONNX to Caffe2. The error message doesn't help me understand the cause.


    IndexError                                Traceback (most recent call last)
    <ipython-input-8-20ea69495a2b> in <module>()
    ----> 1 onnx.utils.polish_model(model)

    1 frames
    /usr/local/lib/python3.6/dist-packages/onnx-1.6.0-py3.6-linux-x86_64.egg/onnx/optimizer.py in optimize(model, passes, fixed_point)
         53             optimized_model_str = C.optimize_fixedpoint(model_str, passes)
         54         else:
    ---> 55             optimized_model_str = C.optimize(model_str, passes)
         56
         57         return onnx.load_from_string(optimized_model_str)

    IndexError: Input 1 is undefined!

    What does "Input 1 is undefined" mean
    Prasanth Pulavarthi
    @prasanthpul
    @MrGrayCode it looks like you are using the optimizer utility. I'm not sure how well that component is maintained (there has been some discussion about deprecating it). Once you have your ONNX model file exported, you can run it with https://github.com/microsoft/onnxruntime, which applies various optimizations to the model and is compatible with any ONNX model.
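    To illustrate that route, a minimal sketch (file names are placeholders) of letting onnxruntime apply its graph optimizations instead of onnx.utils.polish_model, and optionally writing the optimized graph back to disk:

        import onnxruntime as ort

        so = ort.SessionOptions()
        so.graph_optimization_level = ort.GraphOptimizationLevel.ORT_ENABLE_ALL
        # Optional: dump the optimized graph so it can be inspected or reused.
        so.optimized_model_filepath = "model_optimized.onnx"

        # Creating the session runs the optimizations; "model.onnx" is a placeholder path.
        sess = ort.InferenceSession("model.onnx", so, providers=["CPUExecutionProvider"])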
    Noland Chaliha
    @yearofthewhopper
    hi, potentially ridiculous question here, but what is the difference between an .onnx model and a .dnn model? Can ONNX be converted into .dnn, or are they interchangeable if the ONNX model is a DNN?
    Prasanth Pulavarthi
    @prasanthpul
    @yearofthewhopper what program/framework produces .dnn files?
    Guillaume Chevalier
    @guillaume-chevalier
    Thinking about writing an "onnx-neuraxle" library to make the Neuraxle framework compatible with ONNX: https://github.com/Neuraxio/Neuraxle
    Neuraxle is a framework compatible with TensorFlow, PyTorch, and scikit-learn, providing pipelining methods for both automated machine learning and deployment.
    Any thoughts or starting points on this?
    Ke Zhang
    @linkerzhang

    Hi ONNX partners,

    Per the ONNX community agreement, the ONNX 1.7 release is being prepared. Key dates are below.

    1. PRs (for 1.7) merging due date – 2/25/2020.
    2. Release branch (for 1.7) creation date – 2/26/2020.
    3. 1.7 release date – 3/2/2020.

    Active PR owners, please mark your PRs with the "1.7 release" tag and actively push for the merge before the due date.

    Karankumar
    @yesmkaran
    Hi
    how should I convert an ONNX file to TensorFlow?
    With the required dependencies
    Svetlana Levitan
    @sveta-levitan
    Karankumar
    @yesmkaran
    @sveta-levitan Actually, I tried to convert it programmatically; however, I got errors related to TensorFlow that say: ImportError: no module name abs. Also, when I try executing import statements like "from onnx.tf_backend import prepare...", the Jupyter notebook reports "the kernel appears to have died and the kernel is restarting". So what version of TensorFlow is suitable for ONNX?
    I tried looking for the errors on Stack Overflow, but still found no proper solution.
    Any help would be appreciated
    Chin Huang
    @chinhuang007
    @yesmkaran Quick answer: the latest onnx-tf converter release supports TF 1.x, and the onnx-tf master supports TF 2.x. For additional questions and discussions, could you please open an issue at https://github.com/onnx/onnx-tensorflow? Thanks!
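    For reference, the basic onnx-tf flow looks roughly like the sketch below ("model.onnx" and the output path are placeholders; what export_graph writes differs between the TF 1.x release and the TF 2.x master):

        import onnx
        from onnx_tf.backend import prepare  # from the onnx-tf package

        model = onnx.load("model.onnx")      # placeholder path
        tf_rep = prepare(model)              # wrap the ONNX graph as a TensorFlow representation
        # Writes a frozen graph (.pb) with the TF 1.x release, or a SavedModel directory on the TF 2.x master.
        tf_rep.export_graph("model_tf")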
    Ke Zhang
    @linkerzhang
    Hi ONNX partners,
    Due to some key PRs not being merged yet, per the community agreement, the ONNX 1.7 release will be pushed out by one week. Key dates are updated as below:
    a. PRs (for 1.7) merging due date – 3/3/2020.
    b. Release branch (for 1.7) creation date – 3/4/2020.
    c. 1.7 release date – 3/9/2020.
    Dor Marcous
    @engdorm
    Hi,
    How do I change the data in a TensorProto?
    I have initialized tensors in the graph and I want to pad them.
    Does the C++ class have an elegant way to change them (like [] or other helper functions)?
    Prasanth Pulavarthi
    @prasanthpul
    Latest steering committee notes are posted at https://github.com/onnx/steering-committee/blob/master/meeting-notes/20200227.md. TSC meetings are open to everyone to attend. Check the calendar for details.
    Hans Bouwmeester
    @Gra55h0pper
    @linkerzhang, does the 1.7 release go together with an onnxruntime release? I have a use case for an operator ("Inverse") which needs version 12 of the operator set, as per https://github.com/onnx/onnx/blob/master/docs/Operators.md#Inverse, and I was curious when it might be supported. TNX!
    Ke Zhang
    @linkerzhang
    @Gra55h0pper ONNX Runtime's release cadence is not coupled with ONNX's.
    But ONNX Runtime does plan a release in May/June to support the ONNX 1.7 release, including opset 12.
    If you can rely on ONNX Runtime master (built from source), the "Inverse" op may be implemented there first :)
    Hans Bouwmeester
    @Gra55h0pper
    @linkerzhang, thanks much for the information! I have tried building onnxruntime (as a Python wheel) from source, but ran into some errors (on macOS) that I have not tried to debug yet. (I'm able to build onnx itself from source without problems.)
    I will give that another try, as I prefer not to wait until May/June.
    Thanks again!
    Ke Zhang
    @linkerzhang
    @Gra55h0pper feel free to create issues in the ONNX Runtime community if you're facing issues building from source.
    @engdorm the onnx repo does not have such helper functions, I believe. ONNX Runtime may have that.
    @engdorm The ONNX repo is supposed to contain the spec only, not much helper/front-end related code; frameworks and runtimes that follow the ONNX standard should have it, e.g. onnxruntime, pytorch, etc.
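    (Not the C++ answer Dor asked for, but for reference: in Python the onnx package's numpy_helper can round-trip a TensorProto through NumPy, which makes padding an initializer straightforward. A minimal sketch; the model path, the choice of initializer, and the pad width are placeholders:)

        import numpy as np
        import onnx
        from onnx import numpy_helper

        model = onnx.load("model.onnx")                     # placeholder path
        init = model.graph.initializer[0]                   # pick the initializer to modify
        arr = numpy_helper.to_array(init)                   # TensorProto -> numpy array
        padded = np.pad(arr, pad_width=1, mode="constant")  # example: zero-pad one element on every side
        model.graph.initializer[0].CopyFrom(numpy_helper.from_array(padded, name=init.name))
        onnx.save(model, "model_padded.onnx")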
    thomasktruong
    @thomasktruong
    Hi ONNX Contributors, IBM will be hosting the next ONNX Community virtual meetup on April 9th. If you would like to present and share your company's latest activities with ONNX as part of the partner updates & end user stories, please contact Thomas Truong (jedi@us.ibm.com). Please sign up at https://events.linuxfoundation.org/lf-ai-day-onnx-community-virtual-meetup/. Final agenda and Zoom meeting info will be sent to all registered attendees in early April.
    Zhipeng Huang
    @hannibalhuang
    @thomasktruong sent one, plz check email :)
    Thomas Truong
    @ibmjedi
    ONNX Community Members, we are just a couple of weeks away from the April 9th ONNX workshop (virtually via Zoom online meeting) and still have open slots for partner updates & end user stories. Please let me (jedi@us.ibm.com) know ASAP if you would like to present. Also please do not forget to register (https://events.linuxfoundation.org/lf-ai-day-onnx-community-virtual-meetup/). I would like to finalize the agenda by the end of this week. Thank you and look forward to seeing you online on April 9th.