I am using VSCode as an IDE, with the Jupyter plugin that lets Jupyter notebooks run inside VSCode. Everything was working well until recently. Now, when I have a cell with:
import hyperspy.api as hs
the kernel dies. It works fine with
import hyperspy, though. Is that a known issue? Is anybody else using VSCode and the hyperspy API?
Is there a way to adapt the 'fit_component' function to use multiple threads on a CPU? The only mentions of parallel processing I found in the docs were in SAMFire and in the map function. I have a fairly specific order of component fits (for EELS core-loss data) that I wouldn't want to change. Any advice on how to approach this would be appreciated!
(I just recently got my hands on a computer where parallelising would save a significant amount of time.)
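For whatever is genuinely independent in the fitting sequence, the standard-library `concurrent.futures` module is one way to parallelise without touching hyperspy internals. This is only a hedged sketch: `fit_window` is a made-up placeholder for an independent unit of work (e.g. one component fitted over one energy window), and fits that depend on each other's results must of course stay sequential. Note also that pure-Python threads only help if the underlying fit releases the GIL (NumPy/SciPy numeric code usually does).

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-in for one independent piece of fitting work,
# e.g. fitting one component over one energy window.
def fit_window(window):
    lo, hi = window
    return (lo + hi) / 2.0  # placeholder for a fitted peak centre

# Only windows whose fits do not depend on each other go in here.
windows = [(1.0, 2.0), (3.0, 5.0), (7.0, 11.0)]

with ThreadPoolExecutor() as pool:
    # map preserves the input order in its results
    centres = list(pool.map(fit_window, windows))

print(centres)  # [1.5, 4.0, 9.0]
```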
Hello, I have a follow-up question to my question above.
I want to contribute to a package that already exists (the EDS packages). How would I go about using my contribution? By forking the repo and including my methods in the relevant classes, or should I create my own class that inherits from the EDS one? In the latter case, I am struggling to get the inheritance to work properly. I have written some code, but when I try to apply my method to a dataset it complains that EDSSpectrum does not have the attribute. Shouldn't correct inheritance fix this? Or is the only way to write my code in a fork of the relevant signal class?
This might be a trivial question, but I can't seem to find the answer. (Maybe I lack some understanding of classes.)
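A likely cause (a guess, since the code isn't shown): a method added to a subclass only exists on instances of that subclass, and a loader hands back the stock library class, so the loaded object has to be re-wrapped as the subclass first. A minimal pure-Python sketch of the pattern, with made-up class names standing in for the hyperspy ones:

```python
# Made-up stand-ins for the library classes, to show the pattern only.
class EDSSpectrum:                  # plays the role of the library class
    def __init__(self, data):
        self.data = data

class MyEDSSpectrum(EDSSpectrum):   # your subclass with the new method
    def my_method(self):
        return sum(self.data)

loaded = EDSSpectrum([1, 2, 3])     # what a loader would hand back
# loaded.my_method()                # AttributeError: the base class lacks it

s = MyEDSSpectrum(loaded.data)      # re-wrap as the subclass first
print(s.my_method())                # 6
```

If I recall correctly, HyperSpy also has an extension mechanism for registering custom signal classes with the library itself; the developer guide would be the place to check.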
Hi All, I am trying to open .hspy files written with the
RELEASE_next_minor branch using the released 1.6.5 version, and I get the following error:
TypeError: __init__() got an unexpected keyword argument '_type'. Any ideas how to fix / work-around?
Never mind. Just reading with h5py and populating the axes metadata works well enough. Thanks!
I am trying to get quantified EDS from a sample with many elements (Al, C, Co, Cu, Cr, Mo, O, Pt, W, Zn) with a strong peak overlap between Cr-L (0.571 keV) and O-K (0.523 keV).
One layer is expected to be CrC but appears as CrO... the quantification outputs 30% O in it, which is very unexpected for several reasons. Checking the EDS spectrum manually, I can see the peak shift by maybe one channel where it is Cr-L rather than O-K, but the model fitting doesn't pick this up.
I am using the EDS model fitting, and after several hours in the documentation I come here to ask: is there a solution to this? Is there a function somewhere that can, for instance, estimate Cr-L from the Cr-K intensity and subtract it from the O-K signal before saving the O-K intensity? I hope this question is clear enough!
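The arithmetic behind that idea is simple enough to sketch. Everything below is hypothetical: the numbers are invented, and the Cr-L/Cr-K intensity ratio would have to come from a Cr standard measured on the same detector (or from the fitted model), not be assumed.

```python
# Hypothetical numbers for illustration only.
I_CrK = 12000.0      # fitted Cr-K intensity (counts), free of overlap
r_L_over_K = 0.35    # assumed Cr-L / Cr-K intensity ratio for this detector
I_overlap = 9000.0   # total counts in the overlapped Cr-L / O-K window

I_CrL_est = I_CrK * r_L_over_K          # estimated Cr-L contribution
I_OK = max(I_overlap - I_CrL_est, 0.0)  # what is left for O-K

print(I_CrL_est, I_OK)  # 4200.0 4800.0
```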
I am trying to run the following code:
import hyperspy.api as hs
input_filename = "file1.emd"
# the output of the load function is a list in which the last element is an EDSTEM object
spim = hs.load(input_filename)[-1]
spim.change_dtype("float")
spim.crop(1, 70, 400)
spim.crop(2, 0.3, 19.0)
spim.decomposition(True)
It outputs:
ValueError: All the data are masked, change the mask.
It seems to me that the crop calls are the source of this issue, since when I comment them out everything works fine.
I am asking here to check whether I am missing something; if not, I will open an issue on GitHub.
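One thing worth checking before filing the issue (a guess, not a confirmed diagnosis): that error tends to mean every data point ended up masked, which can happen if the crop leaves only NaN/inf values, or an empty range. A small sketch of a sanity check one could run on `spim.data` after cropping:

```python
import numpy as np

def diagnose(data):
    """Report why a decomposition input might come up 'all masked'."""
    data = np.asarray(data, dtype=float)
    if data.size == 0:
        return "empty after crop"
    if not np.isfinite(data).any():
        return "all values NaN/inf"
    return "looks usable"

print(diagnose(np.full((3, 3), np.nan)))  # all values NaN/inf
print(diagnose(np.ones((0, 4))))          # empty after crop
print(diagnose(np.ones((2, 2))))          # looks usable
```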
Hi, I'm not sure if it's just me missing something, but when I import images from a Velox EMD file (this particular file has 9538 HAADF frames), only the scaling from the first frame is retained in the axes_manager values. I've copied my code snippet below.
%matplotlib qt
import hyperspy.api as hs
import numpy as np
import matplotlib.pyplot as plt
import scipy
import hyperspy.misc as hsm

# prevent figure opening
plt.ioff()

# load file
s = hs.load("211217/HeatedTEM/HeatedTEM.emd")
print("imported")
print(s.axes_manager)

# format and save image with time stamp
for single_image in s:
    # Failed attempt at scaling
    single_image.axes_manager.scale
    single_image.axes_manager.scale
    single_image.axes_manager.offset
    single_image.axes_manager.offset
    single_image.axes_manager.units
    single_image.axes_manager.units
    a = single_image.plot(colorbar=False, scalebar=True, axes_ticks=False)
    plt.axis('off')
    plt.savefig('test/image %s.png' % str(s.axes_manager.indices),
                bbox_inches='tight', pad_inches=0.1)
    plt.close()
    # single_image.save("test/image %s.png" % str(image_stack.axes_manager.indices))
But every single_image.axes_manager contains the same values as s.axes_manager (in this case 0.37 nm), so the scaling is identical for every image.