    Alexandre Gramfort
    @agramfort
    @jasongrout do you have any idea of how we could fire such a brain plot from a standard ipython session? before we all switch to jupyter lab :)
    Jason Grout
    @jasongrout
    I've been working in jupyter notebook
    not lab
    Alexandre Gramfort
    @agramfort
    sure, but just so we're on the same page: I work from ipython sessions started with `ipython --matplotlib`
    old school
    Jason Grout
    @jasongrout
    widgets need the jupyter notebook...
    huh... I hadn't thought about the use case of popping up a browser window when working in the console
    there's no technical reason why it couldn't be made to work...
    Alexandre Gramfort
    @agramfort
    let me know if you see a way :)
    Jason Grout
    @jasongrout
    we already have a way of rendering widgets in a static webpage
    Alexandre Gramfort
    @agramfort
    it's just "an engineering" problem #humor :)
    Jason Grout
    @jasongrout
    yeah, someone would just need to write the code to pop up a special webpage with ipywidgets loaded and connected to the kernel
    all the pieces are there, someone would just need to put it together (which would probably require a nontrivial amount of glue code, but pretty straightforward, I would think)
    Alexandre Gramfort
    @agramfort
    that would be amazing
    Jason Grout
    @jasongrout
    (the pieces being: we already have code that renders widgets on a static webpage, and we already have javascript code to connect and talk to a kernel, and it's not too hard to pop up a new browser window. The browser page would have to be served up over a webserver that someone launches, and presumably the page would have the kernel information to connect to the kernel you launched it from)
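The architecture Jason describes (a served page carrying the connection info for the launching kernel) could be sketched roughly like this. This is a hedged sketch only: `widget-embed.js`, `render_widget_page`, and all the port/key values are made-up placeholders, not real ipywidgets APIs.

```python
import json

# Hypothetical kernel connection info, mirroring the fields found in a
# Jupyter connection file (the ports and key here are placeholders).
connection_info = {
    "transport": "tcp",
    "ip": "127.0.0.1",
    "shell_port": 50001,
    "iopub_port": 50002,
    "key": "not-a-real-key",
}

def render_widget_page(connection_info):
    """Build the HTML a small web server could serve: a page that loads
    widget-rendering JavaScript and is told which kernel to connect to."""
    return f"""<!DOCTYPE html>
<html>
<head>
  <!-- assumed: a script bundle that renders widgets and talks to a kernel -->
  <script src="widget-embed.js"></script>
</head>
<body>
  <div id="widget-root"></div>
  <script>
    // the page carries the connection info for the kernel that launched it
    const kernelInfo = {json.dumps(connection_info)};
  </script>
</body>
</html>"""

page = render_widget_page(connection_info)
```

The glue code Jason mentions would then be a small web server that serves this page and proxies (or exposes) the kernel's ports to the browser.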
    an application that can directly access the python memory will be faster on a local machine, of course - we still have the limitation of having to copy the data to the browser.
    of course, the plus is that a browser-based solution can be hosted on mybinder, like the LIGO project did, so you get much wider reach
    Jason Grout
    @jasongrout
    all right: I'll tinker more with pythreejs, and I'll work on getting the next ipywidgets out so that we can use the fast data transfer in it.
    Chris Holdgraf
    @choldgraf
    @jasongrout just want to say thanks for your help, I'm glad you stopped by our sprint! Excited to see what comes next
    Jason Grout
    @jasongrout
    Thanks for inviting me! It was a lot of fun. Sorry I didn't make it down today
    Jason Grout
    @jasongrout
    @choldgraf, @agramfort - here's a demo of how fast it is now: https://www.youtube.com/watch?v=WSyflGpJPvE
    Jason Grout
    @jasongrout
    (and there still are some optimizations that I think speed things up more - for example, when I change the color here, I'm naively completely reconstructing all of the geometry, not just setting the vertex colors...)
    Jean-Rémi KING
    @kingjr
    Waaaaa
    Alexandre Gramfort
    @agramfort
    @jasongrout I am blown away!
    Chris Holdgraf
    @choldgraf
    Man that's beautiful
    Christoph Dinh
    @chdinh
    Not to take away from the buzz; it is impressive. Though the slider update seems to hit the same obstacle as the current stc browser.
    Jason Grout
    @jasongrout
    the slider update is causing a new color array to be constructed, sent to the browser, and the whole geometry is reconstructed and rerendered
    there still are some trivial things to speed this up, like:
    (a) transfer uint8 rgb tuples instead of float32 rgb tuples -> 1/4 the data transfer
    (b) just set the color array, instead of reconstructing the whole geometry, recomputing normals, etc.
    let's see how close that gets us to real-time color changes
    also, I'm transferring a new color for every vertex. If you're only updating some vertices, likely you can speed things up a lot more by just transferring the changed data.
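Optimization (a) above is easy to quantify: switching a per-vertex RGB payload from float32 to uint8 quarters the transfer. A minimal sketch, using the 360k-vertex mesh size mentioned later in this thread as an assumed figure:

```python
from array import array

n_vertices = 360_000            # assumed mesh size from the discussion
rgb_float = array('f', [0.0])   # one float32 color channel
rgb_uint8 = array('B', [0])     # one uint8 color channel

bytes_float32 = n_vertices * 3 * rgb_float.itemsize  # 3 channels per vertex
bytes_uint8 = n_vertices * 3 * rgb_uint8.itemsize

def to_uint8(x):
    """Quantize a [0, 1] float color channel to uint8 before transfer."""
    return max(0, min(255, round(x * 255)))
```

With float32 that is about 4.3 MB per slider tick versus about 1.1 MB as uint8, before even considering sending only the changed vertices.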
    So @chdinh - still definitely a work in progress
    I'm not sure what 'stc browser' is, but I have hopes we can get close to real-time updates
    Christoph Dinh
    @chdinh
    @jasongrout what I was referring to is the per-vertex activity coloring. An stc contains this activity over time, in most cases for a subset of vertices, e.g., 10k vertices. So what has to be done is a smoothing interpolation to assign an activity color to the remaining 360k vertices. In the ideal case this can be done with a compute shader.
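The interpolation step Christoph describes can be illustrated with a toy sketch: only a subset of vertices carries activity, and every other vertex gets a value derived from nearby known vertices. Real code would use mesh adjacency smoothing (and ideally run per frame on the GPU, as he suggests); the naive 1D nearest-neighbor lookup below is just a stand-in for that idea.

```python
# position -> activity for the known subset of vertices (toy 1D data)
active = {0.0: 0.8, 5.0: 0.2}

def interpolate(position, active):
    """Assign the activity of the nearest known vertex. Placeholder for a
    proper smoothing pass over the mesh, which a compute shader could do."""
    nearest = min(active, key=lambda p: abs(p - position))
    return active[nearest]

# every remaining vertex gets a value from the known subset
values = [interpolate(p, active) for p in (1.0, 4.0)]
```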
    Jason Grout
    @jasongrout
    that makes sense to use a shader for something like that.
    we have shaders exposed in three.js, or we could use a new package built on something like regl on the JS side.
    this was a proof-of-concept - definitely just the first step.
    we have basic support for shaders in pythreejs, but we should upgrade to newer versions of threejs, probably, and experiment with it.
    (see https://threejs.org/docs/#Reference/Materials/ShaderMaterial - we'd make a new material that contained the shader program...)
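To make the shader idea concrete: a ShaderMaterial carries GLSL source for a vertex and fragment program. The sketch below holds that source as Python strings, as a material wrapper would; the `activity` attribute and the color ramp are illustrative assumptions, not an existing API (`projectionMatrix`, `modelViewMatrix`, and `position` are standard three.js built-ins).

```python
# Hypothetical GLSL a ShaderMaterial-style material could carry, coloring
# each vertex from a per-vertex activity value sent from the kernel.
vertex_shader = """
attribute float activity;   // per-vertex activity (illustrative name)
varying float v_activity;
void main() {
    v_activity = activity;
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}
"""

fragment_shader = """
varying float v_activity;
void main() {
    // map activity onto a simple red-blue ramp
    gl_FragColor = vec4(v_activity, 0.0, 1.0 - v_activity, 1.0);
}
"""
```

Updating a uniform or attribute buffer each frame would then replace the full geometry reconstruction the demo currently does.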
    Jason Grout
    @jasongrout
    Perhaps the thing to do is (a) experiment with real use cases, and (b) map out what would be useful to do moving forward. What workflows make sense in the notebook, how easy is it to take it out of the notebook if that's useful, etc.
    Chris Holdgraf
    @choldgraf
    @jasongrout this is all great, thanks for your help! Sounds like this is a good plan - we'll iterate on this a bit in the coming weeks (may take a little bit since folks are traveling) and will def be in touch!
    Chris Holdgraf
    @choldgraf
    I opened an issue here to make sure that we don't lose track of this thread mne-tools/mne-python#4150
    Alexandre Gramfort
    @agramfort
    @/all we should agree ASAP on when to do the next sprint. A big sprint before the end of 2017 seems too short to set up. Any preference? Paris, or Berkeley if we can get some support from BIDS, seems like a good plan to me.
    Jean-Rémi KING
    @kingjr
    +1 for 2018, ideally not overlapping with a big cog/neuro/ML conference
    Alexandre Gramfort
    @agramfort
    March or April could work?
    Chris Holdgraf
    @choldgraf
    I'm +1 for 2018 too... Berkeley would probably work, but here's the thing: next year is going to be busy for me. Around March/April I'll be in the final months of planning not one but two weddings (hooray for international relationships)
    The only other concern I have about Berkeley is that the Bay Area is a pain in the butt to get to, considering most of the MNE team is on the East Coast or in Europe
    BUT, I'm just posting these as concerns, not hard lines. JR / Alex, how much time did it take to organize this before? I'm pretty sure BIDS would be able to get us a room, but it wouldn't be a big conference room like we had at NYU (BIDS doesn't have as much space as NYU does)