MurilooMoreira
@MurilooMoreira
Hum, I see, thank you Francisco. So, if I have this value and multiply it by the total number of components of my decomposition (the number of energy channels), will I obtain the value of my component variance?
jeinsle
@jeinsle

hiya, back to my loading-big-EDS-maps question from above. After using a hack that @sem-geologist suggested over on GitHub, I can get all my BCF files to load. But now I want to start exploring the data using the decomposition tools of HyperSpy. I have loaded using the lazy signal,
sig = hs.load('hdf5/BA_map_*.hspy', stack=True, lazy=True)
sig

which gives me an object:
<LazyEDSSEMSpectrum, title: hdf5, dimensions: (999, 999, 108|2048)>

but when I try to run my stack through PCA I keep getting the error

Axis value must be an integer, got range(0, 3)

Any suggestions on what is causing the problem? I'll dump the full traceback below.

here is the full error:
`----> 2 sig.decomposition(True, algorithm='PCA', output_dimension=20)
3 #sigt.plot_decomposition_results()
4 sig.plot_explained_variance_ratio(log=False)
5 sig.plot_explained_variance_ratio(log=True)

~\AppData\Local\conda\conda\envs\hyperspy2\lib\site-packages\hyperspy_signals\lazy.py in decomposition(self, normalize_poissonian_noise, algorithm, output_dimension, signal_mask, navigation_mask, get, num_chunks, reproject, bounds, **kwargs)
762 sdim = self.axes_manager.signal_dimension
763 bH, aG = da.compute(
--> 764 data.sum(axis=range(ndim)),
765 data.sum(axis=range(ndim, ndim + sdim)))
766 bH = da.where(sm, bH, 1)

~\AppData\Local\conda\conda\envs\hyperspy2\lib\site-packages\dask\array\core.py in sum(self, axis, dtype, keepdims, split_every, out)
1754 from .reductions import sum
1755 return sum(self, axis=axis, dtype=dtype, keepdims=keepdims,
-> 1756 split_every=split_every, out=out)
1757
1758 @derived_from(np.ndarray)

~\AppData\Local\conda\conda\envs\hyperspy2\lib\site-packages\dask\array\reductions.py in sum(a, axis, dtype, keepdims, split_every, out)
229 dt = getattr(np.empty((1,), dtype=a.dtype).sum(), 'dtype', object)
230 return reduction(a, chunk.sum, chunk.sum, axis=axis, keepdims=keepdims,
--> 231 dtype=dt, split_every=split_every, out=out)
232
233

~\AppData\Local\conda\conda\envs\hyperspy2\lib\site-packages\dask\array\reductions.py in reduction(x, chunk, aggregate, axis, keepdims, dtype, split_every, combine, name, out, concatenate, output_size)
127 if isinstance(axis, int):
128 axis = (axis,)
--> 129 axis = validate_axis(axis, x.ndim)
130
131 if dtype is None:

~\AppData\Local\conda\conda\envs\hyperspy2\lib\site-packages\dask\array\utils.py in validate_axis(axis, ndim)
142 return tuple(validate_axis(ax, ndim) for ax in axis)
143 if not isinstance(axis, numbers.Integral):
--> 144 raise TypeError("Axis value must be an integer, got %s" % axis)
145 if axis < -ndim or axis >= ndim:
146 raise AxisError("Axis %d is out of bounds for array of dimension %d"

TypeError: Axis value must be an integer, got range(0, 3)`
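For context, the failure is reproducible without HyperSpy or dask at all: `validate_axis` (shown at the bottom of the traceback) accepts only an integer or a tuple of integers, so the bare `range` object that `lazy.py` passes is rejected. A minimal sketch of that check, mirroring the traceback above:

```python
import numbers

def validate_axis(axis, ndim):
    # Simplified from dask.array.utils.validate_axis (see traceback above):
    # a tuple is validated element-wise; anything else must be an integer.
    if isinstance(axis, tuple):
        return tuple(validate_axis(ax, ndim) for ax in axis)
    if not isinstance(axis, numbers.Integral):
        raise TypeError("Axis value must be an integer, got %s" % axis)
    if axis < -ndim or axis >= ndim:
        raise ValueError("Axis %d is out of bounds for dimension %d" % (axis, ndim))
    return axis

validate_axis((0, 1, 2), 3)      # fine: a tuple of ints
# validate_axis(range(0, 3), 3)  # raises the TypeError seen in the chat
```

So wrapping the `range` in `tuple(...)` before it reaches `sum` would sidestep the error.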

Thomas Aarholt
@thomasaarholt
@k8macarthur I've loaded recently-acquired Velox images from Velox 2.9 without issue
@jeinsle Could you try slicing the data to get a tiny dataset (sig2 = sig.inav[:10,:10,:5] should be fine) and then running sig2.compute() followed by sig2.decomposition(True, algorithm='PCA', output_dimension=20)? I'm just wondering if the problem is with the lazy part or the regular decomposition part.
Francisco de la Peña
@francisco-dlp
@jeinsle, PCA doesn't work properly in lazy mode, even when it runs. We need to fix that. NMF should work better, but setting the different parameters is not straightforward. I would try optimizing the NMF parameters on a small section of the dataset and then go for the whole thing.
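HyperSpy's NMF is delegated to scikit-learn, but the tune-on-a-small-section-then-scale-up idea Francisco describes can be sketched with a bare multiplicative-update NMF in NumPy. This is toy data, not the EDS stack, and the sizes and component count are placeholders:

```python
import numpy as np

def nmf(X, k, iters=500, seed=0):
    # Bare multiplicative-update NMF (Lee & Seung): X ~ W @ H, all non-negative
    rng = np.random.default_rng(seed)
    W = rng.random((X.shape[0], k))
    H = rng.random((k, X.shape[1]))
    for _ in range(iters):
        H *= (W.T @ X) / (W.T @ W @ H + 1e-9)
        W *= (X @ H.T) / (W @ H @ H.T + 1e-9)
    return W, H

# Toy stand-in for an unfolded spectrum image: 400 pixels x 64 channels,
# an exactly non-negative rank-3 mixture
rng = np.random.default_rng(1)
data = rng.random((400, 3)) @ rng.random((3, 64))

# 1) Tune on a cheap subset until the component count and factors look right
W_sub, H = nmf(data[:100], k=3)

# 2) Reuse the learned spectral factors H on the full dataset:
#    update only the loadings W, holding H fixed
W = np.random.default_rng(2).random((400, 3))
for _ in range(500):
    W *= (data @ H.T) / (W @ H @ H.T + 1e-9)

rel_err = np.linalg.norm(data - W @ H) / np.linalg.norm(data)
```

The point of step 2 is that once the spectral factors are settled on a representative section, fitting the remaining loadings is a much cheaper, embarrassingly parallel problem.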
Francisco de la Peña
@francisco-dlp
@MurilooMoreira, you get the explained variance in explained_variance only when setting centre=True and normalize_poissonian_noise=False; this is the only decomposition that should be called PCA. But I don't advise you to do that, since then the decomposition will be worse (more components and more noise). The thing is that, when we use different settings (typically centre=False), we still call it PCA, but that's not PCA, just plain SVD. Therefore, what you get in the wrongly named attribute explained_variance is the singular values squared and divided by the number of components. This becomes the explained variance only when using standard PCA with the settings mentioned above.
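The centred case can be checked directly in NumPy: after centring, the squared singular values (divided here by n - 1, the common convention; the exact divisor HyperSpy uses may differ) equal the variance of the data projected onto the principal axes. A small sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 8))

# Centring first is what turns a plain SVD into PCA
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

# Squared singular values over n - 1 (divisor convention may vary)...
explained_variance = s**2 / (X.shape[0] - 1)

# ...equal the variance of the data along each principal axis
proj = Xc @ Vt.T
assert np.allclose(explained_variance, proj.var(axis=0, ddof=1))
```

Without the centring step, the same quantities are just rescaled squared singular values of the raw data, which is what Francisco means by "not PCA, just plain SVD".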
Thomas Aarholt
@thomasaarholt
@francisco-dlp are we tracking the PCA lazy problem in an issue?
jeinsle
@jeinsle

@jeinsle, PCA doesn't work properly in lazy mode, even when it runs. We need to fix that. NMF should work better, but setting the different parameters is not straightforward. I would try optimizing the NMF parameters on a small section of the dataset and then go for the whole thing.

ahh, that is great to know. I can maybe do that. The thing for me is to figure out how many of the 108 tiles are representative to work with. Does anyone in the community have methods documented for working with sections of a dataset and then generalizing?

jeinsle
@jeinsle

@jeinsle Could you try slicing the data to get a tiny dataset (sig2 = sig.inav[:10,:10,:5] should be fine) and then running sig2.compute() followed by sig2.decomposition(True, algorithm='PCA', output_dimension=20)? I'm just wondering if the problem is with the lazy part or the regular decomposition part.

Yeah, this cropped-down version seems to be running, so I will now look into how to break down big datasets and piece together solutions.

SBW90
@SBW90

Hi, I'm trying to get the intensity values for different diffraction spots. I'm using an interactive ROI to select the spots:

roi = hs.roi.CircleROI(20, 20, 20, r_inner=0)
signal.plot()
roi_circ = roi.interactive(signal, color='red')

and then using
roi_circ.events.data_changed.trigger(roi_circ)
roi_circ.data.sum()

to return the intensity values. Currently I am re-running the cell to get the intensity values. Is there a way to continuously return the roi_circ.data.sum() every time I move the ROI?

Thanks

Francisco de la Peña
@francisco-dlp
Yes, for that interactive is your friend.
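Francisco is referring to HyperSpy's `hs.interactive`, which re-evaluates a function whenever an event fires. A rough sketch of the pattern, untested here; the example signal is just a stand-in (the same one used later in this chat), and the `axis` argument and ROI coordinates will need adjusting for your actual data and HyperSpy version:

```python
import hyperspy.api as hs

# Stand-in image signal; replace with your diffraction signal
signal = hs.datasets.example_signals.object_hologram()

roi = hs.roi.CircleROI(200, 200, 50, r_inner=0)
signal.plot()
roi_circ = roi.interactive(signal, color='red')

# Recompute the sum every time the sliced data changes,
# i.e. whenever the ROI is dragged; no manual event triggering needed
total = hs.interactive(roi_circ.sum,
                       event=roi_circ.events.data_changed,
                       recompute_out_event=None)
total.plot()  # updates live as the ROI moves
```

This replaces the manual `roi_circ.events.data_changed.trigger(...)` call: the event fires on its own when the ROI is moved, and `hs.interactive` re-runs the sum for you.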
Thomas Aarholt
@thomasaarholt
Trying to investigate the failing tests of #2240, I'm noticing that on master, running pytest --mpl hyperspy\tests\drawing\test_plot_signal2d.py results in a lot of failed image comparisons.
The following are the original, test-generated and difference images:
baseline-test_plot_multiple_images_list_None-None.png
test_plot_multiple_images_list_None-None.png
test_plot_multiple_images_list_None-None-failed-diff.png
Are mpl-tests platform-dependent?
MurilooMoreira
@MurilooMoreira
@francisco-dlp Ok Francisco, it helped me a lot, thank you!
Yueming Guo
@DrYGuo
a silly question: I use a Mac. I tried "command + arrow" to navigate in the image to show the spectrum, but it did not work.
no errors or warnings
Thomas Aarholt
@thomasaarholt
need access to see it. Requested.
@DrYGuo
Yueming Guo
@DrYGuo
Thanks! I've just granted permission. Are you able to open it now?
Thomas Aarholt
@thomasaarholt
Ah! I take it that the problem is that the spectrum doesn't change when you try to navigate?
The reason that happens is that jupyter notebook defaults to a non-interactive "backend" for matplotlib.
Thomas Aarholt
@thomasaarholt
Now, in regular Jupyter notebooks we fix this by calling %matplotlib notebook or, if one has installed ipympl (pip install ipympl), %matplotlib widget. These are interactive backends that are displayed in the notebook and let you move around interactively.
Unfortunately, this doesn't seem to be working in (the otherwise very impressive) google colab environment.
When I call the former, there is just no output.
The attached video shows how it should look with a simple example.
Yueming Guo
@DrYGuo
Hi Thomas, it is working on my local computer but not on Google Colab. I am trying to figure out why that is the case.
pquinn-dls
@pquinn-dls
Sorry for being off-topic, but we're looking for someone to do data analysis and development across electron and x-ray imaging using HyperSpy and related tools. If you could please forward to anyone you think might be interested:
https://vacancies.diamond.ac.uk/vacancy/data-analysis-scientist-xray-and-electron-scanning-microscopy-391631.html
Kim132
@Kim132
I was trying to use Line2DROI with interactive(), but it does not work. Jupyter notebook says 'data type "void24" not understood' when I type these. How can I solve this error?
image.png
김선제
@muytjswp87_gitlab
@Kim132 I am Kim132 with a different ID.
My image works super well with RectangularROI() and Point2DROI(); only Line2DROI does not work
Thomas Aarholt
@thomasaarholt
@Kim132 @muytjswp87_gitlab Working fine here with hyperspy 1.5.2:
image.png
s2 = hs.datasets.example_signals.object_hologram()
line1 = hs.roi.Line2DROI(200, 200, 400, 400)
s2.plot() 
r = line1.interactive(s2, color='red')
Thomas Aarholt
@thomasaarholt
I just checked, and it also works on latest master.
Joshua Taillon
@jat255
has anyone played around with plotting the output of a Digital Micrograph FFT?
HyperSpy loads it as a ComplexSignal2D
plotting it like so:
image.png
whereas DM shows:
image.png
Joshua Taillon
@jat255
figured it out
DM displays the log of the modulus of complex values by default, with bilinear interpolation
there may be some value in making such a display the default for power spectra
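The DM-style display Joshua describes is a couple of lines of NumPy; the bilinear part is just the interpolation setting at plot time (e.g. matplotlib's `imshow(..., interpolation='bilinear')`). A sketch on a toy image:

```python
import numpy as np

img = np.random.default_rng(0).random((64, 64))

# Centre the zero-frequency component, then take the log of the modulus,
# as DM does by default for FFT display
fft = np.fft.fftshift(np.fft.fft2(img))
display = np.log(np.abs(fft) + 1e-12)  # small offset guards against log(0)
```

The log compresses the huge dynamic range of a power spectrum so that weak spots remain visible next to the central peak, which is presumably why DM makes it the default.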