Thanks, it works!
刘政(Barry Liu)
That's fine.

Hi, everyone!
I am trying to estimate local activity using the inverse solution from the tutorial "Compute MNE inverse solution on evoked data with a mixed source space".
Thanks to the function mne.extract_label_time_course, I could extract time courses in the local area.
However, the result does not look like usual EEG data; it looks like power activity.
The figure I attached shows the result.
How do I convert it from power activity back to usual EEG data?

Additionally, how do you get a label's index (for example, in the above tutorial, bankssts-lh is 0 and Brain-stem is -1)?
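On the second question, a minimal generic sketch may help (the label names below are stand-ins taken from the tutorial; in practice they would come from the actual labels object): the rows returned by mne.extract_label_time_course follow the order of the labels passed in, so an index can be looked up by name.

```python
# Stand-in label names for illustration; in practice these would be
# obtained as: label_names = [label.name for label in labels]
label_names = ['bankssts-lh', 'bankssts-rh', 'Brain-stem']

# The row index of a label in the extracted time-course array is simply
# its position in the labels list.
idx = label_names.index('bankssts-lh')       # first label -> row 0
# The last entry can also be addressed with a negative index, e.g. -1.
is_last = label_names[-1] == 'Brain-stem'
```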

Richard Höchenberger

@/all We are very happy to let you know we've set up a new forum to make it easier for users to get help and discuss all things MNE. 🥳 Hopefully, this will also create a searchable archive of all discussions, problem descriptions, and solutions. It will also make it easier for the developers to get an idea of where exactly they may need to fine-tune or re-think parts of MNE. I'd like to invite you all to come and join us over at https://mne.discourse.group

You'll be the first users there, and we'd love to hear your feedback on all things you like about the forum, and of course also the stuff that needs improvement.

If you asked a question here on Gitter that has gone unanswered so far, please consider posting it again on the forum. It will be much easier for everyone to see your question and to provide an answer. 💪 Thank you!

What I want to do is split this graph in the middle and overlay the -1 s to 6 s segment and the 6 s to 12 s segment on a single timeline.
This is the picture I want

fig, axes = plt.subplots(2, 1, figsize=(10, 7), sharex=True, sharey=True)
colors = plt.get_cmap('winter_r')(np.linspace(0, 1, 4))
for ((freq_name, fmin, fmax), average), color, ax in zip(
        frequency_map, colors, axes.ravel()[::-1]):
    times = average.times * 1e3  # seconds -> milliseconds
    gfp = np.sum(average.data ** 2, axis=0)  # global field power
    gfp = mne.baseline.rescale(gfp, times, baseline=(None, 0))
    ax.plot(times, gfp, label=freq_name, color=color, linewidth=2.5)
    ax.axhline(0, linestyle='--', color='grey', linewidth=2)
    # bootstrap confidence interval around the GFP trace
    ci_low, ci_up = bootstrap_confidence_interval(average.data, random_state=1)
    ci_low = rescale(ci_low, average.times, baseline=(None, 0))
    ci_up = rescale(ci_up, average.times, baseline=(None, 0))
    ax.fill_between(times, gfp + ci_up, gfp - ci_low, color=color, alpha=0.3)
    ax.annotate('%s (%d-%dHz)' % (freq_name, fmin, fmax),
                xy=(0.95, 0.8),
                xycoords='axes fraction')
    ax.set_xlim(-1000, 12000)

axes.ravel()[-1].set_xlabel('Time [ms]')

This is my code
Nitish Gupta

Hello everyone, I am Nitish Gupta, a B.Tech. undergraduate from India. I was exploring the project when I came across the idea "Facilitate access to open EEG/MEG databases".
I see that we currently support quite a few datasets, mainly hosted on osf.io.
The URL mapping of these datasets is done inside utils.py, with their names and hashes, after which they are passed to the downloader.
I have also been exploring some open EEG data sources listed here. Since every dataset I come across has a permanent download link, some approaches I could think of are:

  • Manually finding and mapping more dataset download links the same way it is currently done inside mne/datasets/utils.py
  • Creating separate instances and grouping data to serve it to users in a new form; for instance, data from Brainstorm could be clubbed together and served via a single API call like mne.datasets.brainstorm('resting') instead of the currently used bst_resting.data_path()

I was wondering what other possible approaches or ideas there are. I would love to hear input from the community.
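One way to picture the first approach is a name → (URL, hash) registry with checksum verification, mirroring the mapping done in mne/datasets/utils.py. This is only an illustrative sketch: the registry name, URL, and hash below are made up and are not MNE's actual API.

```python
import hashlib

# Hypothetical registry: dataset key -> (download URL, expected SHA-256).
# Both entries here are placeholders, not real MNE dataset records.
DATASET_REGISTRY = {
    'bst_resting': ('https://osf.io/XXXX/download',
                    'placeholder-sha256-hex-digest'),
}

def verify_checksum(data: bytes, expected_sha256: str) -> bool:
    """Return True if the downloaded bytes match the registered hash."""
    return hashlib.sha256(data).hexdigest() == expected_sha256
```

A downloader would fetch the URL for a key and then call verify_checksum on the bytes before unpacking, which is the essential contract behind the names/hashes mapping described above.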

Clemens Brunner
@imnitishng @wz2019235020 please use the new MNE Forum for asking questions: https://mne.discourse.group/

I'm wondering if there were any changes between MNE 0.21 and 0.22 regarding the use of multiple CPUs/cores. I'm running MNE-based code in JupyterLab inside a Docker container (based on jupyter/scipy-notebook). On a few-months-old version of the Docker image, where MNE 0.21 is available, the code utilizes all 8 available cores on the machine, as visible in top (in an impressively efficient manner, by the way). When I updated to a more recent version of the same image, and to MNE 0.22, the exact same code (I mount it into the container from the outside, so it's identical in both runs) uses only 1 core, with the process lingering at 102-105% in top.

Do you know if there have been any changes on the MNE side that might cause this change in behaviour? Or should I be looking at the change in jupyter-provided docker base images?

JupyterLab versions bump from 2.2.8 to 2.2.9 between the images. Python goes from 3.8.5 (compiled with GCC 7.5.0) to 3.8.6 (compiled with GCC 9.3.0).
Given the jump in major compiler version, is it possible that this causes different core-utilization behaviour in OMP/OpenBLAS?
Oh, and I'm running the code through JupyterLab.

In case someone else runs into my issue: I'm using conda-forge as the primary package channel. For some reason, it defaults to the non-MKL BLAS implementations, so by adding

- "libblas=*=*mkl"

to my environment.yml file, it switches to MKL. No idea why it was using MKL before, though, since my environment.yml file didn't change when I updated.
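For reference, a minimal environment.yml illustrating where that pin goes (the package list is illustrative; only the libblas line is the actual fix described above):

```yaml
name: mne
channels:
  - conda-forge
dependencies:
  - python=3.8
  - mne
  - "libblas=*=*mkl"  # force the MKL BLAS variant from conda-forge
```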


Hello, I am trying to plot connectivity using plot_sensor_connectivity. However, I get the following error:

BackgroundPlotter has moved to pyvistaqt. You can install this from PyPI with: pip install pyvistaqt Then import it via: from pyvistaqt import BackgroundPlotter BackgroundPlotter is no longer accessible by pyvista.BackgroundPlotter

I am using Jupyter notebook.
Thanks for a nice tool!

Alexandre Gramfort
@indridieinarsson @LunaHub please use the new MNE Forum for asking questions: https://mne.discourse.group/
How can I change the figure DPI in MNE's plot functions? Thanks!
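Since MNE's plotting functions return standard matplotlib figures, one option (a generic matplotlib sketch, not MNE-specific API) is to set the DPI either globally or on a returned figure:

```python
import matplotlib
matplotlib.use('Agg')  # headless backend, just for this sketch
import matplotlib.pyplot as plt

# Option 1: set the DPI globally before calling any plot function.
matplotlib.rcParams['figure.dpi'] = 150

# Option 2: adjust a returned figure afterwards. MNE plot functions
# return a matplotlib Figure; a plain figure stands in for one here.
fig = plt.figure()
fig.set_dpi(200)
# fig.savefig('out.png', dpi=300)  # DPI can also be set at save time
```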
Hello, I am trying to run "Plot a cortical parcellation" (through Spyder) on my own data, following this MNE example:
Brain = mne.viz.get_brain_class()
but I get this error:
AttributeError: 'module' object has no attribute 'get_brain_class'
python -c "import mne; mne.sys_info()"
Python:        3.8.6 | packaged by conda-forge | (default, Dec 26 2020, 05:05:16)  [GCC 9.3.0]
mne:           0.22.0
numpy:         1.19.5 {blas=NO_ATLAS_INFO, lapack=lapack}
scipy:         1.6.0
matplotlib:    3.3.3 {backend=Qt5Agg}
sklearn:       0.24.0
numba:         0.52.0
nibabel:       3.2.1
nilearn:       0.7.0
dipy:          1.3.0
cupy:          Not found
pandas:        1.2.0
mayavi:        4.7.2
pyvista:       0.27.4 {pyvistaqt=0.2.0, OpenGL 4.6 (Core Profile) Mesa 20.0.8 via Mesa DRI Intel(R) UHD Graphics 630 (CFL GT2)}
vtk:           9.0.1
PyQt5:         5.12.3
Richard Höchenberger

Hello @LeilaNS, welcome! Could you please post your question to our brand new shiny MNE forum? Your question will be seen by more users (and developers!) there, and chances are much higher that we'll manage to resolve your issue together!

You can find the forum at https://mne.discourse.group

Thank you!

Rahul Raj
Hi everyone, I'm Rahul. I want to contribute to the organisation. Can someone point me in the right direction for getting started and setting up the project?
刘政(Barry Liu)
And now we post our questions to the brand new shiny MNE forum. https://mne.discourse.group/
Hello everyone, I am Kabir. I am new here and looking forward to learning some cool stuff from this organization. How can I get started? Thanks
Mikolaj Magnuski
@kabirpack Hi, Kabir. We are no longer active on Gitter; we chose Discourse instead. See the channel here: https://mne.discourse.group/
Thanks for the info @mmagnuski
Hi! Excuse me, I have a small question. When I used the notch filter
raw.notch_filter(np.arange(50, 100), n_jobs=1, fir_design='firwin')
I got: "The requested filter length 6601 is too short for the requested 0.00 Hz transition band, which requires 440001 samples"
Could you tell me how to calculate the "filter_length" parameter? I need to change it according to the message.
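For what it's worth, one thing that stands out here (an observation, not a confirmed diagnosis): np.arange(50, 100) requests a notch at every integer frequency from 50 to 99 Hz, and notches spaced only 1 Hz apart leave essentially no room for transition bands. A common pattern for 50 Hz line noise is to notch only the line frequency and its harmonics:

```python
import numpy as np

# Notch frequencies at 50 Hz and its harmonics, spaced 50 Hz apart,
# instead of at every integer frequency in a range.
freqs = np.arange(50, 251, 50)  # 50, 100, 150, 200, 250 Hz
# raw.notch_filter(freqs, fir_design='firwin')  # raw as in the message above
```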
Clemens Brunner
@NastyaPolt please use our new forum at https://mne.discourse.group/
Jean-Rémi KING
I'm trying to read a cnt EEG file with mne.io.read_raw_cnt and get "UnicodeDecodeError: 'ascii' codec can't decode byte 0x93 in position 6: ordinal not in range(128)", any idea what to do?
Kamal Chauhan
Hello everyone, I am Kamal Chauhan. I am new here and looking forward to gaining some experience from this organization. How can I get started? Thanks
Stefan Appelhoff

Hey @kamalc218:matrix.org if you are here about GSoC, this is the page for you: https://github.com/mne-tools/mne-python/wiki/GSoC-Ideas

it still mentions gitter (we need to update that), but we do not actually use gitter anymore. We use https://mne.discourse.group/ if you have any usage questions, and GitHub issues if you have any feature requests, bug reports and the like

gitter (this channel) is deprecated

I have a few doubts about raw EEG data processing. 1. How do I create channel locations for an EEG signal acquired from a hospital? 2. I'm getting an index error when I try to pick specific channels (IndexError: index 0 is out of bounds for axis 0 with size ). How can I fix this? 3. I am also getting an attribute error for the hospital EEG signal (AttributeError: 'RawEDF' object has no attribute 'encode'). Could you please tell me how to fix these errors?
Stefan Appelhoff
@mshree736 please head over to https://mne.discourse.group/ to ask your question(s), this channel (gitter) is deprecated.
I'm having issues with reading channel locations from a .bdf file. When I run raw.plot_sensors(), it gives me a runtime error: No valid channel positions found.
Rodrigo Hübner
Hello guys!
I need some help with an error I get right after running epoch.filter(l_freq=20., h_freq=450.):
Error: ValueError: picks (None, treated as "data_or_ica") yielded no channels, consider passing picks explicitly
Rodrigo Hübner
That error happens when I use epoch.plot_psd too...
I am working with sEMG data... Does anyone know what's going on?
Rodrigo Hübner
haaaa ok... I will post it on https://mne.discourse.group/ (I just saw it now)
Christian O'Reilly
In the estimation of current source density (CSD), MNE-Python uses a stiffness and a regularization parameter. Whereas the stiffness parameter is described in the Perrin CSD paper, I have not seen any reference for the regularization. Is the addition of regularization something specific to MNE-Python, or does it follow some previously published work?
Christian O'Reilly
It seems to me that the regularization parameter can be used to control the spatial scale of the information in the CSD, but I am tempted to set the regularization to 0 to get the "vanilla" implementation of CSD. However, since MNE-Python's default is not 0, I imagine the default value of 10^-5 has been validated and found to be optimal. If such a validation exists (aside from https://mne.tools/dev/auto_examples/preprocessing/plot_eeg_csd.html), it would be great to know where it is published.
Just seen the thread on the new forum. I will post this question there.