    padmasreenagarajan
    @padmasreenagarajan
    Okay. I have opened an issue under onnx-tensorflow, which can be found at the link below:
    onnx/onnx-tensorflow#764
    @prasanthpul Do you have any solution for this?
    Jim Spohrer
    @jimspohrer
    Reminder: Next ONNX Community Meeting/Workshop online Oct 14, 7-10am PT. Got an ONNX use case to present? Please let the ONNX Steering Committee know - and get on the agenda. Join LF AI Slack and onnx-general channel for updates and more information - https://slack.lfai.foundation
    Prasanth Pulavarthi
    @prasanthpul
    @padmasreenagarajan I don't. Most of us use ONNX Runtime (https://onnxruntime.ai) for inferencing
    Jim Spohrer
    @jimspohrer
    Reminder: Join Slack to be part of ONNX SIG & Working Group, Steering Committee, Release, and General discussions - many have moved already, thank you! Sign up here: https://slack.lfai.foundation - then go to channel + browse, and add the slack channel
    padmasreenagarajan
    @padmasreenagarajan
    Hi all!
    padmasreenagarajan
    @padmasreenagarajan
    For inferencing MobileNetV3, I am using TIDL by Texas Instruments.
    h-swish is an activation function that is not supported by the TIDL import tool.
    So, could anyone suggest an alternative to "h-swish"?
    Or is there any possibility of adding an "h-swish" operator to the ONNX operator set?
    11 replies
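A possible workaround, since h-swish is defined as x * ReLU6(x + 3) / 6: rebuild it from primitives (Add, Clip/ReLU6, Mul, Div) that import tools more commonly support. A minimal numpy sketch of that decomposition (whether TIDL supports these particular primitives is an assumption to verify against its op list):

```python
import numpy as np

def h_swish(x):
    # h-swish(x) = x * ReLU6(x + 3) / 6, built only from
    # add, clip (ReLU6), multiply, and divide primitives.
    return x * np.clip(x + 3.0, 0.0, 6.0) / 6.0

x = np.array([-4.0, -1.0, 0.0, 1.0, 4.0])
print(h_swish(x))  # saturates to 0 below -3 and to x above +3
```

The same decomposition can be expressed as a small ONNX subgraph (Add, Clip, Mul, Div nodes) replacing each HardSwish node in the exported model.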
    Jim Spohrer
    @jimspohrer
    @ShuangLiu may have pointers - Tencent/ncnn#1402
    1 reply
    ONNX Community Meeting Workshop event website and registration are open - see onnx-general slack channel for more details. Join ONNX Slack via https://slack.lfai.foundation - then go to channel + browse, and add slack channel
    Ofir Zafrir
    @ofirzaf
    Hi, I am interested in ONNX Runtime quantized inference. I can't find detailed documentation about the existing quantization options, or about how to know which kernel will actually run after converting a model with the ONNX Runtime quantization API.
    1 reply
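For context on what the quantization tooling (onnxruntime.quantization) produces: the core transform is linear int8 quantization with a scale and zero point. A simplified numpy sketch of that mapping (this illustrates the scheme, not ONNX Runtime's actual implementation, which also selects per-kernel strategies):

```python
import numpy as np

def quantize_int8(x):
    # Asymmetric linear quantization: q = round(x / scale) + zero_point.
    # A simplified sketch of the int8 scheme, not ORT's code.
    scale = (x.max() - x.min()) / 255.0
    zero_point = np.round(-x.min() / scale) - 128
    q = np.clip(np.round(x / scale) + zero_point, -128, 127).astype(np.int8)
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    return (q.astype(np.float32) - zero_point) * scale

x = np.linspace(-1.0, 1.0, 9).astype(np.float32)
q, s, zp = quantize_int8(x)
x_hat = dequantize(q, s, zp)  # close to x, within ~scale/2
```

Which int8 kernel actually executes afterwards depends on the operators in the graph and the execution provider; inspecting the converted model (e.g. in Netron) shows which nodes were replaced by quantized variants.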
    Tiberio
    @tiberiusferreira
    Hello everyone!
    Screen Shot 2020-09-27 at 22.32.24.png
    I'm trying to create a simple model and train it using ONNX Runtime, but I can't find an equivalent of sess = onnxruntime.InferenceSession(path_save) for training.
    2 replies
    Tiberio
    @tiberiusferreira
    I guess https://github.com/microsoft/onnxruntime/blob/d9ecc0cebf8752893d4cd4e547341e390dc29b01/orttraining/orttraining/python/ort_trainer.py#L542 is what I'm looking for? I thought there would be something similar to onnxruntime.InferenceSession .
    1 reply
    Dimitra Karatza
    @dimitraka
    Hello everyone, I am using the TensorFlow backend for ONNX to convert an ONNX model to TensorFlow. However, after the conversion, the graph of the converted model is broken.
    image_2020_09_29T12_16_52_111Z.jpg
    image_2020_09_29T12_17_01_532Z.jpg
    Is there any way to fix this?
    1 reply
    Alex Garustovich
    @Yukigaru
    Hi chat, I've made a PR to onnx/onnx, but it hasn't been reviewed. How can I draw attention to it? Thanks!
    1 reply
    This one: onnx/onnx#3036
    Jim Spohrer
    @jimspohrer
    The ONNX Steering Committee is excited about the program for next week's community meeting: https://events.linuxfoundation.org/lf-ai-day-onnx-community-virtual-meetup-fall/program/schedule/
    ONNX Community Meeting Fall 2020 - Wednesday Oct 14, 10am-1pm ET - Register here: https://events.linuxfoundation.org/lf-ai-day-onnx-community-virtual-meetup-fall/register/
    Jim Spohrer
    @jimspohrer
    @harryskim > As Jim mentioned, we are migrating Gitter to Slack. Please sign up for LF AI slack using https://slack.lfai.foundation/ and join "onnx-general" channel.
    Ke Zhang
    @linkerzhang
    @Yukigaru both Gitter (here) and Slack are good places to draw such attention :).
    Brian Chen
    @ToucheSir
    Quick question: what's the best place to ask spec/implementation-related questions? I tried onnx-general on Slack, but it seems to be primarily announcements/development talk...
    2 replies
    While I'm at it, what's the difference between onnx and onnx-general?
    yangkkokk
    @yangkkokk
    Hi, I'm new here.
    T.taisuke
    @taisuke-tomida
    I've converted PyTorch => ONNX. Now I want to convert it to "tflite" - is there any way to do that?
    I think I need to convert from "channel_first" to "channel_last", but I don't know how.
    T.taisuke
    @taisuke-tomida
    Does "onnx-tensorflow" internally convert "NCHW" to "NHWC"? GitHub:https://github.com/onnx/onnx-tensorflow/blob/master/example/onnx_to_tf.py
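On the layout question: for a single tensor, NCHW -> NHWC is just an axis transpose; converters generally handle it by inserting Transpose ops rather than rewriting the data (whether onnx-tensorflow does this internally is exactly the question above, so treat that as unconfirmed). A minimal numpy sketch of the axis mapping:

```python
import numpy as np

# NCHW (PyTorch/ONNX convention) -> NHWC (TFLite's expected layout).
x_nchw = np.arange(2 * 3 * 4 * 5).reshape(2, 3, 4, 5)  # N, C, H, W
x_nhwc = np.transpose(x_nchw, (0, 2, 3, 1))            # N, H, W, C
print(x_nhwc.shape)  # (2, 4, 5, 3)
```

The same permutation applies to convolution weights (OIHW -> HWIO) when moving a whole model between layouts.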
    Chun-Wei Chen
    @jcwchen
    Hi ONNX community,
    TestPyPI Packages for ONNX 1.8.0 are available now: https://test.pypi.org/project/onnx/1.8.0rc0/#files (All versions) Please let us know if there is any problem with your usage and finish the verification by the end of this week. Thank you!
    jiang jianjun
    @jianjunjiang
    I wrote a lightweight, portable, pure C99 ONNX inference engine for embedded devices with hardware acceleration support.
    It runs ONNX models with pure C99 inference
    and supports a hardware acceleration interface.
    If you are interested, please star it! Thanks.
    The license is MIT.
    hududed
    @hududed
    Hi @SelComputas, was wondering if you've found a solution for creating heatmaps/PCA/explainability for ONNX models?
    Chun-Wei Chen
    @jcwchen
    Hi ONNX community,
    I am happy to announce that ONNX 1.8.0 has been released. https://github.com/onnx/onnx/releases/tag/v1.8.0
    PyPI packages are available here: https://pypi.org/project/onnx/ Conda packages will also be available soon. Thank you everyone.
    mjk1323
    @mjk1323
    How does ONNX connect to data sources to feed a model for processing?
    Prasanth Pulavarthi
    @prasanthpul
    If you use ONNX Runtime or have tried it out before, you can provide feedback to the team via this brief survey: https://aka.ms/ort-survey
    Deepak Chauhan
    @meedeepak
    Hi, need help with this one microsoft/onnxruntime#5834