    Mainak Jas
    @jasmainak
    @johnsam7
    for select_label in ['parsorbitalis', 'parstriangularis']:
        label = [l for l in labels if select_label in l.name][0]
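    For context, a minimal sketch of where such a labels list could come from; the fsaverage subject and the aparc parcellation are assumptions, not stated in the chat:

    import mne

    # Assumed inputs: the FreeSurfer subject "fsaverage" with the standard
    # "aparc" parcellation; subjects_dir comes from the MNE configuration.
    labels = mne.read_labels_from_annot("fsaverage", parc="aparc", hemi="both")

    for select_label in ["parsorbitalis", "parstriangularis"]:
        # Keep the first label whose name contains the requested string
        # (names look like "parsorbitalis-lh" / "parsorbitalis-rh").
        label = [l for l in labels if select_label in l.name][0]
        print(label.name, len(label.vertices))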
    Adonay Nunes
    @AdoNunes
    @bloyl today is the day: we will test your code for automatic muscle artifact detection and, assuming it works nicely, you could open a PR and implement it in mne-python once and for all
    Alexandre Gramfort
    @agramfort
    @/all the MNE web app I showed yesterday with binder integration is here: https://github.com/agramfort/mne-web-apps. It's really for inspiration.
    Alex Rockhill
    @alexrockhill
    Working on current source density using Dennis's code... if anyone who has a better idea of how this should be implemented has 5 minutes to look into it with me, that would be helpful.
    mshamalainen
    @mshamalainen
    Mainak Jas
    @jasmainak
    hello @/all please help us improve the contributing guide by commenting here: mne-tools/mne-python#6898
    Daniel McCloy
    @drammock
    @bloyl there is an environment variable MNE_LOGGING_LEVEL
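    A short sketch of the ways that logging level can be set; the "WARNING" value is only an example:

    import os
    import mne

    # Option 1: environment variable (what is referred to above).
    os.environ["MNE_LOGGING_LEVEL"] = "WARNING"

    # Option 2: persist it in the MNE config file.
    mne.set_config("MNE_LOGGING_LEVEL", "WARNING")

    # Option 3: change the level for the current session only.
    mne.set_log_level("WARNING")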
    Robert Oostenveld
    @robertoostenveld
    Please find my slides on https://www.dropbox.com/s/vyrl55t2okjhk6o/20191003_mne_sprint_bids.pptx?dl=0 and do vote! Oh, I did not mention that I am in one of the proposed steering groups…
    Robert Oostenveld
    @robertoostenveld
    The quick link that takes you to the email with the information about the BIDS voting is https://groups.google.com/forum/#!topic/bids-discussion/tuxK_nrMX38
    Alexandre Gramfort
    @agramfort
    here is the link to vote https://forms.gle/YT5SvPgACHhVvf6x8
    olafhauk
    @olafhauk
    Has anyone used grow_labels to create a whole-cortex set of searchlights?
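    Not an answer from the chat, but a hedged sketch of what a whole-cortex set of searchlights built with mne.grow_labels might look like; the sample subject, the 200-vertex seed spacing, and the 10 mm extent are arbitrary assumptions:

    from pathlib import Path
    import numpy as np
    import mne

    subjects_dir = Path(mne.datasets.sample.data_path()) / "subjects"
    subject = "sample"

    # Seed every 200th vertex on each hemisphere's white surface.
    seeds, hemis = [], []
    for hemi_idx, hemi in enumerate(("lh", "rh")):
        rr, _ = mne.read_surface(subjects_dir / subject / "surf" / f"{hemi}.white")
        verts = np.arange(0, len(rr), 200)
        seeds.extend(verts.tolist())
        hemis.extend([hemi_idx] * len(verts))

    # Grow a ~10 mm label ("searchlight") around every seed; labels may overlap.
    labels = mne.grow_labels(subject, seeds, extents=10.0, hemis=hemis,
                             subjects_dir=subjects_dir, overlap=True)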
    Alexandre Gramfort
    @agramfort
    subj
    Eric Larson
    @larsoner
    @alexrockhill 5275c0989dff8007811103b951dfb487 is your 0.74 MD5
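    For anyone verifying a download the same way, a small sketch of computing an MD5 checksum in Python; the file name below is hypothetical:

    import hashlib

    def md5sum(path, chunk_size=1 << 20):
        # Read the file in chunks so large files do not need to fit in memory.
        h = hashlib.md5()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                h.update(chunk)
        return h.hexdigest()

    # Hypothetical file name; compare against the hash posted above.
    print(md5sum("the_downloaded_file") == "5275c0989dff8007811103b951dfb487")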
    Alex Rockhill
    @alexrockhill
    Thanks!
    Adonay Nunes
    @AdoNunes
    @agramfort we would like the slides of yesterday's talk
    Mohamed Sherif
    @mohdsherif
    @agramfort yes, please
    fmamashli
    @fmamashli
    @agramfort me too
    Alexandre Gramfort
    @agramfort
    Mohamed Sherif
    @mohdsherif
    thanks - do you also have a link to the Jupyter notebooks?
    Alexandre Gramfort
    @agramfort
    Robert Oostenveld
    @robertoostenveld
    @jasmainak regarding realtime, please check out http://www.eegsynth.org and https://github.com/eegsynth/eegsynth, which are our ongoing efforts for real-time EEG processing. That also contains the latest Python code for real-time processing, the FieldTrip buffer, and LSL interfacing. The background is that we use it in our http://www.oneplusoneisthree.org art collective performances.
    Mainak Jas
    @jasmainak
    @jcmosher and @mshamalainen could you share your slides as well?
    Alexandre Gramfort
    @agramfort
    Daniel McCloy
    @drammock
    @mohdsherif @klankinen @NataKozh if anyone wants another "easy" fix, you could add a link to this page:
    to our documentation of fsaverage.
    Robert Oostenveld
    @robertoostenveld
    dear all, I also made a short report of what I worked on this week, including links to the online material. Please see https://www.dropbox.com/s/d459ydvg9xdtyov/20191004_mne_sprint_report.pptx?dl=0
    olafhauk
    @olafhauk
    In case anyone has suggestions on how to visualise and evaluate spatial resolution for EEG/MEG, I implemented the tools required to produce the results for linear methods in https://www.biorxiv.org/content/10.1101/672956v1
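    A hedged sketch of computing such resolution metrics with the resolution-matrix functions available in recent MNE-Python; the function names, the sample-data file names, and the dSPM/PSF choices are assumptions, not taken from the chat:

    from pathlib import Path
    import mne
    from mne.datasets import sample
    from mne.minimum_norm import (make_inverse_resolution_matrix,
                                  read_inverse_operator, resolution_metrics)

    meg_path = Path(sample.data_path()) / "MEG" / "sample"
    fwd = mne.read_forward_solution(meg_path / "sample_audvis-meg-eeg-oct-6-fwd.fif")
    fwd = mne.pick_types_forward(fwd, meg=True, eeg=False)  # MEG-only inverse below
    fwd = mne.convert_forward_solution(fwd, surf_ori=True, force_fixed=True)
    inv = read_inverse_operator(meg_path / "sample_audvis-meg-oct-6-meg-inv.fif")

    # Resolution matrix for dSPM, then peak localisation error of the PSFs.
    resmat = make_inverse_resolution_matrix(fwd, inv, method="dSPM", lambda2=1.0 / 9.0)
    ple = resolution_metrics(resmat, fwd["src"], function="psf", metric="peak_err")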
    Alex Rockhill
    @alexrockhill
    @johnsam7 Would you also be willing to share your slides? I think the cerebellum is outstanding 👍 :)
    John C Mosher
    @jcmosher
    @jasmainak Here are dropbox links to my two sets of slides I presented on Wednesday evening. The beamforming covariance talk is at https://www.dropbox.com/s/8vivu1ftzt2alj1/2019_ACMEGS_Mosher_Beamforming_Las_Vegas.pptx?dl=0, and the Okada Constant is at https://www.dropbox.com/s/nn1uptbhdyc5aly/2019_Equinoxe_Reunion_Okada_Constant.pptx?dl=0.
    mshamalainen
    @mshamalainen
    My Monday presentation (https://www.dropbox.com/s/qei5gitrmqnnws3/1909-MEGinverse-WoodsHole.pdf?dl=0) and the brief talk about "Ilmoniemi graphs" (https://www.dropbox.com/s/le27q2t7t02zz9x/Ilmoniemi-graphs.pdf?dl=0) are now in the dropbox.
    John Samuelsson
    @johnsam7
    @alexrockhill Here are the cerebellum slides: https://www.dropbox.com/s/bfh5ylipo3qygq0/MNE_sprint.pptx?dl=0
    1337!
    Adonay Nunes
    @AdoNunes

    I found out why Python IDEs were so slow on my Mac. It turns out that some backends (qt, qt5, osx) interfere with macOS App Nap, and each typed command takes about 10 seconds to run.
    To check it, run the following line several times:
    import time; start = time.time(); time.sleep(0.0005); print('dt = ', time.time() - start)

    If it starts to take about 10 seconds, it can be fixed with:
    conda install appnope
    and then, in Python:
    import appnope
    appnope.nope()

    and this excruciating and annoying issue will be solved!

    Mainak Jas
    @jasmainak
    Thanks a lot @mshamalainen and @jcmosher. All the talks in the sprint were great!
    Alex Rockhill
    @alexrockhill
    And thanks @johnsam7!
    Eric Larson
    @larsoner
    @AdoNunes interesting about appnope, never heard of that but it's good to know! Feel free to make an entry in our FAQ section if you want
    Adonay Nunes
    @AdoNunes
    @larsoner PR sent!
    Alex Rockhill
    @alexrockhill
    I'm having issues with the coreg GUI. I assume it's just my setup. If anyone has the time to help me with that, that would be awesome.
    Adonay Nunes
    @AdoNunes
    @alexrockhill I had the same issue; it was fixed by setting this in the terminal: export ETS_TOOLKIT=qt
    Eric Larson
    @larsoner
    ahh yes MacOS
    did you pip install "PyQt5>=5.10"?
    python -c "import mne; mne.sys_info()" should tell you
    because under mayavi it will give you your PyQt5 version
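    A small sketch combining the two suggestions above; whether "qt" or "qt4" is the right ETS_TOOLKIT value depends on your ETS version, so treat it as an assumption:

    # Set the Traits/ETS backend before anything imports Mayavi
    # (equivalent to `export ETS_TOOLKIT=qt` in the shell).
    import os
    os.environ.setdefault("ETS_TOOLKIT", "qt")

    import mne

    # Prints the Python, Qt binding, and Mayavi versions, which is what the
    # sys_info check above looks at (the coreg GUI wants PyQt5 >= 5.10).
    mne.sys_info()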