Is there a way of adapting the fit_component function to work using multiple threads on a CPU? The only mention of parallel processing that I found in the docs was in SAMFire and in the map function. I have a relatively specific order of component fits (for EELS core-loss data) that I wouldn't want to change. Any advice on how to approach this would be appreciated!
(I just recently got my hands on a computer, where parallelising would save a significant amount of time)
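One pattern worth considering (a minimal sketch, not a hyperspy API): keep the required order of component fits *within* each spectrum, and parallelise across navigation positions, which are independent of each other. `fit_pixel` below is a hypothetical stand-in for your ordered sequence of `m.fit_component(...)` calls.

```python
from concurrent.futures import ThreadPoolExecutor

def fit_pixel(spectrum):
    # placeholder "fit": replace with your ordered component fits,
    # which run strictly in sequence within this one spectrum
    return sum(spectrum) / len(spectrum)

def fit_all(spectra, workers=4):
    # parallelism is across spectra only; for CPU-bound fitting a
    # ProcessPoolExecutor avoids the GIL, at the cost of requiring
    # picklable arguments
    with ThreadPoolExecutor(max_workers=workers) as ex:
        return list(ex.map(fit_pixel, spectra))

print(fit_all([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]]))  # [2.0, 5.0]
```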
Hello, I have a follow-up question to my question above.
I want to contribute to a package that already exists (the EDS packages). How would I go about using my contribution? By forking the repo and including my methods in the relevant classes, or should I create my own class and have it inherit from the EDS one? In the latter case, I am struggling to get the inheritance to work properly. I have written some code, but when I try to apply my method to a dataset it complains that EDSSpectrum does not have the attribute. Should correct inheritance not fix this? Or is the only way to create my code in a forked repo of the relevant signal?
This might be a trivial question but I can't seem to find the answer. (Maybe I lack some understanding of classes.)
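A plain-Python illustration (not hyperspy code) of one likely cause of the AttributeError: loading functions return instances of the library's *own* class, so methods defined on a subclass never exist on the loaded object. The class names below are stand-ins.

```python
class EDSSpectrum:              # stand-in for the library's class
    def __init__(self, data):
        self.data = data

class MyEDS(EDSSpectrum):       # hypothetical user subclass
    def my_method(self):
        return max(self.data)

loaded = EDSSpectrum([1, 5, 3])          # what a loader hands back
assert not hasattr(loaded, "my_method")  # hence the AttributeError
rewrapped = MyEDS(loaded.data)           # re-wrap into the subclass
assert rewrapped.my_method() == 5
```

Inheritance is correct in that case; the problem is that the loaded object was never an instance of the subclass in the first place.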
Hi All, I am trying to open .hspy files written with
the RELEASE_next_minor branch with the released 1.6.5 version and I get the following error:
TypeError: __init__() got an unexpected keyword argument '_type'. Any ideas how to fix / work-around?
Never mind. Just reading with h5py and populating the axes metadata works well enough. Thanks!
I am trying to get quantified EDS results from a sample with many elements (Al, C, Co, Cu, Cr, Mo, O, Pt, W, Zn) with a strong peak overlap between Cr-L (0.571 keV) and O-K (0.523 keV).
One layer is expected to be CrC but appears as CrO... the quantification outputs 30% O in it, which is very unexpected for several reasons. Checking the EDS spectrum manually, I can see the peak shift by maybe 1 channel when it is Cr-L instead of O-K, but the model fitting doesn't pick this up.
I am using the EDS model fitting, and after several hours in the documentation I have come here to ask: is there a solution to this? Is there a function somewhere that can, for instance, estimate Cr-L from the Cr-K intensity and subtract it from the O-K signal before saving the O-K intensity? I hope this question is clear enough!
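For what it's worth, the stripping idea can be sketched in a few lines. This helper is not a hyperspy function: it estimates the Cr-L counts from the overlap-free Cr-K intensity via an L/K intensity ratio that would have to be measured on a pure Cr reference on the same instrument, then subtracts that estimate from the O-K counts. The 0.3 ratio below is a made-up placeholder.

```python
def strip_cr_l(o_k_counts, cr_k_counts, cr_l_over_k=0.3):
    """Subtract the estimated Cr-L contribution from the O-K counts."""
    return o_k_counts - cr_k_counts * cr_l_over_k

print(strip_cr_l(8000.0, 12000.0))  # 8000 - 0.3 * 12000 = 4400.0
```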
I am trying to run the following code :
import hyperspy.api as hs

input_filename = "file1.emd"
# the output of the load function is a list in which the last element
# is an EDSTEM object
spim = hs.load(input_filename)[-1]
spim.change_dtype("float")
spim.crop(1, 70, 400)
spim.crop(2, 0.3, 19.0)
spim.decomposition(True)
It outputs:
ValueError: All the data are masked, change the mask.
It seems to me that the crop functions are the source of this issue, since when I comment them out everything is fine.
I am asking here to check if I am missing something. But I will post an issue on github if not.
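One thing worth checking before filing the issue (a small sanity check, not hyperspy API): crop limits passed as floats are interpreted in calibrated axis units, so bounds that fall outside the axis range can leave an empty, fully masked signal. The axis parameters below are placeholders for the values in your signal's axes_manager.

```python
def crop_overlaps(offset, scale, size, start, end):
    """True if the requested [start, end) range overlaps the axis."""
    lo, hi = offset, offset + scale * size
    return max(start, lo) < min(end, hi)

# e.g. a hypothetical energy axis running from 0.0 to 20.0 keV
assert crop_overlaps(0.0, 0.01, 2000, 0.3, 19.0)       # inside: fine
assert not crop_overlaps(0.0, 0.01, 2000, 25.0, 30.0)  # outside: empty
```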
Hi, I'm not sure if it's just me missing something. But when I import images from Velox EMD (this particular file has 9538 HAADF frames), only the scaling from the first frame is retained in the axes_manager values. I've copied my code snippet below.
%matplotlib qt
import hyperspy.api as hs
import numpy as np
import matplotlib.pyplot as plt
import scipy
import hyperspy.misc as hsm

# prevent figure opening
plt.ioff()

# load file
s = hs.load("211217/HeatedTEM/HeatedTEM.emd")
print("imported")
print(s.axes_manager)

# format and save image with time stamp
for single_image in s:
    # Failed attempt at scaling
    single_image.axes_manager.scale
    single_image.axes_manager.offset
    single_image.axes_manager.units
    a = single_image.plot(colorbar=False, scalebar=True, axes_ticks=False)
    plt.axis('off')
    plt.savefig('test/image %s.png' % str(s.axes_manager.indices),
                bbox_inches='tight', pad_inches=0.1)
    plt.close()
    # single_image.save("test/image %s.png" % str(image_stack.axes_manager.indices))
But every single_image.axes_manager contains the same value as s.axes_manager (in this case 0.37 nm), so the scaling is the same for every image.
In PyCharm, I had the code below, which previously gave interactive hyperspy plots:
import matplotlib
matplotlib.rcParams["backend"] = "Agg"
import hyperspy.api as hs
Now I am getting the following warnings and no plots are shown. Please help if possible, and let me know if you need more information.
WARNING:hyperspy_gui_traitsui:The agg matplotlib backend is not compatible with the traitsui GUI elements. For more information, read http://hyperspy.readthedocs.io/en/stable/user_guide/getting_started.html#possible-warnings-when-importing-hyperspy. WARNING:hyperspy_gui_traitsui:The traitsui GUI elements are not available.
[<EDSTEMSpectrum, title: EDS, dimensions: (|4096)>, <EDSTEMSpectrum, title: EDS, dimensions: (|4096)>, <EDSTEMSpectrum, title: EDS, dimensions: (|4096)>, <EDSTEMSpectrum, title: EDS, dimensions: (|4096)>, <Signal2D, title: x, dimensions: (|512, 512)>, <Signal2D, title: HAADF, dimensions: (|512, 512)>, <Signal2D, title: x, dimensions: (|512, 512)>, <Signal2D, title: x, dimensions: (|512, 512)>, <Signal2D, title: x, dimensions: (|512, 512)>, <Signal2D, title: x, dimensions: (|512, 512)>, <Signal2D, title: x, dimensions: (|512, 512)>, <Signal2D, title: x, dimensions: (|512, 512)>, <Signal2D, title: x, dimensions: (|512, 512)>, <Signal2D, title: x, dimensions: (|512, 512)>, <Signal2D, title: x, dimensions: (|512, 512)>, <EDSTEMSpectrum, title: EDS, dimensions: (512, 512|4096)>]
from scipy.misc import face

img = face(gray=True)
plt.figure()
plt.imshow(img, cmap=cmap)
plt.colorbar()
Hey all, I am trying to get the decomposition of a sum of a section of the frames of an EDS signal.
When I load the entire signal with sum_frames=True, the decomposition works fine. But when I load it with sum_frames=False and then sum all the frames using .sum('Time'), the resulting signal looks different (lower intensity) and the decomposition works but the result is different (worse). What is the difference between the sum_frames=True argument and summing the frames after loading using .sum('Time')?
Also, when I try to sum a section of the frames in time (e.g. the first 20) I get the error message: ValueError: All the data are masked, change the mask.
Is there a way to avoid this error and get the decomposition over a range of time?
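A toy arithmetic check of the slicing approach (shapes below are placeholders): summing the whole stack over the time axis equals summing the first N frames plus the rest, so slicing out a frame range *before* summing is well defined. In hyperspy terms this would be something like s.inav[:20].sum('Time'), assuming 'Time' is the navigation axis in your file.

```python
import numpy as np

stack = np.arange(5 * 3 * 4, dtype=float).reshape(5, 3, 4)  # (time, x, E)
total = stack.sum(axis=0)          # all frames
first = stack[:2].sum(axis=0)      # "first 20 frames" analogue
rest = stack[2:].sum(axis=0)       # remaining frames
assert np.allclose(total, first + rest)
```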
When I replace the test files with actual files I have on my device (same extensions), and I run the test script I get the following messages:
WARNING:hyperspy.io:Unable to infer file type from extension 'ASW'. Will attempt to load the file with the Python imaging library.
ERROR:hyperspy.io:If this file format is supported, please report this error to the HyperSpy developers.
Hi, I am using EDS models and am having trouble restoring the stored ones.
I create a model m, fit it, and store it, and also copy it to an EDS_model variable:
# create a model using all selected elements:
m = si.create_model()
m.fit()
m.fit_background()
# Reduce the element selection to the one of interest and quantify
kfactors = Assign_elements2Quant()
xray_lines_selected = si.metadata.Sample.xray_lines
m_int_fit = m.get_lines_intensity(xray_lines_selected)
m.store()
EDS_model = m
After this, the "si" is saved to a .hspy file.
If I look at the EDS_model, it is fine; I can plot it and so on.
EDS_model Out: <EDSTEMModel, title: EDX>
But when I try to load the .hspy file again, I see the model is there, with the components, but I cannot restore it or plot it... why?
l = hs.load(signal_type="EDS_TEM", escape_square_brackets=True)
l.models

Out:
└── a
    ├── components
    │   ├── Al_Ka
    │   ├── Al_Kb
    │   ├── C_Ka
    │   ├── Co_Ka
    │   ├── Co_Kb
    │   ├── Co_La
    │   ├── Co_Lb3
    │   ├── Co_Ll
    │   ├── Co_Ln
    │   ├── Cr_Ka
    │   ├── Cr_Kb
    │   ├── Cr_La
    │   ├── Cr_Lb3
    │   ├── Cr_Ll
    │   ├── Cr_Ln
    │   ├── Mo_Ka
    │   ├── Mo_Kb
    │   ├── Mo_La
    │   ├── Mo_Lb1
    │   ├── Mo_Lb2
    │   ├── Mo_Lb3
    │   ├── Mo_Lg1
    │   ├── Mo_Lg3
    │   ├── Mo_Ll
    │   ├── Mo_Ln
    │   ├── O_Ka
    │   ├── W_La
    │   ├── W_Lb1
    │   ├── W_Lb2
    │   ├── W_Lb3
    │   ├── W_Lb4
    │   ├── W_Lg1
    │   ├── W_Lg3
    │   ├── W_Ll
    │   ├── W_Ln
    │   ├── W_M2N4
    │   ├── W_M3O4
    │   ├── W_M3O5
    │   ├── W_Ma
    │   ├── W_Mb
    │   ├── W_Mg
    │   ├── W_Mz
    │   ├── Zn_Ka
    │   ├── Zn_Kb
    │   ├── Zn_La
    │   ├── Zn_Lb1
    │   ├── Zn_Lb3
    │   ├── Zn_Ll
    │   ├── Zn_Ln
    │   └── background_order_6
    ├── date = 2022-02-08 12:09:22
    └── dimensions = (96, 89|2048)

Mymodel = l.models.a.restore()

Traceback (most recent call last):
  File "C:\Users\oldo\AppData\Local\Temp/ipykernel_14272/3459143043.py", line 1, in <module>
    Mymodel = l.models.a.restore()
  File "C:\Users\oldo\.conda\envs\hspy_env\lib\site-packages\hyperspy\signal.py", line 82, in <lambda>
    self.restore = lambda: mm.restore(self._name)
  File "C:\Users\oldo\.conda\envs\hspy_env\lib\site-packages\hyperspy\signal.py", line 241, in restore
    return self._signal.create_model(dictionary=copy.deepcopy(d))
  File "C:\Users\oldo\.conda\envs\hspy_env\lib\site-packages\hyperspy\_signals\eds_tem.py", line 745, in create_model
    model = EDSTEMModel(self,
  File "C:\Users\oldo\.conda\envs\hspy_env\lib\site-packages\hyperspy\models\edstemmodel.py", line 45, in __init__
    EDSModel.__init__(self, spectrum, auto_background, auto_add_lines,
  File "C:\Users\oldo\.conda\envs\hspy_env\lib\site-packages\hyperspy\models\edsmodel.py", line 131, in __init__
    Model1D.__init__(self, spectrum, *args, **kwargs)
  File "C:\Users\oldo\.conda\envs\hspy_env\lib\site-packages\hyperspy\models\model1d.py", line 279, in __init__
    self._load_dictionary(dictionary)
  File "C:\Users\oldo\.conda\envs\hspy_env\lib\site-packages\hyperspy\model.py", line 306, in _load_dictionary
    id_dict.update(self[-1]._load_dictionary(comp))
  File "C:\Users\oldo\.conda\envs\hspy_env\lib\site-packages\hyperspy\component.py", line 1223, in _load_dictionary
    raise ValueError(
ValueError: _id_name of parameters in component and dictionary do not match
environment.yml file. However, for my analysis I need to use nonlinear functionality, which isn't included in hyperspy v1.6.5 - but cloning
RELEASE_next_minor is something which can't be tracked using an
environment.yml file. Does anyone have a suggestion for how to get nonlinear functionality in a reproducible way? @LMSC-NTappy proposed forking
RELEASE_next_minor to my own GitHub, pip installing the fork, and then leaving that fork unchanged - this is what I'll do for now. But I'm always grateful for other suggestions - thanks!
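One alternative to maintaining a frozen fork: conda's environment.yml accepts a pip section, and pip can install straight from a git URL pinned to an exact commit, which is reproducible without any fork. A sketch, where the commit hash and the Python version are placeholders you'd fill in:

```yaml
# environment.yml -- pin the development branch at an exact commit
name: hyperspy-dev
dependencies:
  - python=3.10
  - pip
  - pip:
      - git+https://github.com/hyperspy/hyperspy.git@<commit-hash>
```

Because the hash identifies one immutable snapshot of upstream, anyone recreating the environment gets exactly the code you tested.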