import hyperspy.api as hs
from hyperspy.axes import UniformDataAxis
import dask.array as da
from hyperspy.datasets.example_signals import EDS_SEM_Spectrum
from hyperspy._signals.eds_sem import LazyEDSSEMSpectrum
from hyperspy._signals.signal2d import LazySignal2D
s = EDS_SEM_Spectrum()
data = s.data
# Custom uniform energy axis matching the example spectrum's 1024 channels
axis = UniformDataAxis(offset=-0.1, scale=0.01, size=1024, units="keV")
s2 = LazyEDSSEMSpectrum(data, axes=[axis])
s2.add_elements(s.metadata.Sample.elements)
s2.set_microscope_parameters(beam_energy=10.)
# Multiply by a transposed 2500 x 1000 navigator to broadcast the spectrum
# into a large lazy dataset
nav = LazySignal2D(da.random.random((2500, 1000)))
s = s2 * nav.T
print("Shape:", s.data.shape)  # (2500, 1000, 1024) -> ~20 GB of float64
s.save("lazy.hspy", compression=None, overwrite=True, chunks=(100, 1000, 1024))
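To verify the result without pulling ~20 GB into memory, the file can be re-opened lazily (standard hs.load usage; the expected dimensions below are my assumption, not verified output):
s_check = hs.load("lazy.hspy", lazy=True)
print(s_check)  # expect a lazy EDS signal with dimensions (2500, 1000|1024)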
Why does conda create --name testdask hyperspy result in an error? I'm running mamba on my M1 Mac, and installing hyperspy is giving a really weird error today, while installing jupyter notebook works fine:
(base) ➜ ~ conda create --name testdask hyperspy
Collecting package metadata (current_repodata.json): done
Solving environment: failed with repodata from current_repodata.json, will retry with next repodata source.
Collecting package metadata (repodata.json): done
Solving environment: |
Found conflicts! Looking for incompatible packages.
This can take several minutes. Press CTRL-C to abort.
failed
UnsatisfiableError:
(No info after the UnsatisfiableError)
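One possible cause (my assumption; the truncated solver output above doesn't confirm it): hyperspy is packaged on conda-forge rather than the defaults channel, so solving without specifying the channel can end in an UnsatisfiableError. Passing the channel explicitly may help:
conda create --name testdask -c conda-forge hyperspy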
We can call the decomposition function and use our estimator with the algorithm= keyword argument. The .fit method takes some arguments, and they are not included in the parameters of the estimator object itself because they depend on the fitted data (for example, but not limited to that, graph regularization needs a 2D-shaped input). I thought that if decomposition takes **kwargs, they would be passed to the fit function of the algorithm object, but it seems to me that I was wrong.
Is there a way for si.decomposition(algorithm='ORPCA') to return the variance? I ran standard SVD, then ORPCA, and am receiving the following:
---------------------------------------------------------------------------
AttributeError Traceback (most recent call last)
<ipython-input-99-659ba5d350c0> in <module>
----> 1 si.plot_explained_variance_ratio(threshold=4, xaxis_type='number')
c:\users\owner\documents\github\hyperspy\hyperspy\learn\mva.py in plot_explained_variance_ratio(self, n, log, threshold, hline, vline, xaxis_type, xaxis_labeling, signal_fmt, noise_fmt, fig, ax, **kwargs)
1427
1428 """
-> 1429 s = self.get_explained_variance_ratio()
1430
1431 n_max = len(self.learning_results.explained_variance_ratio)
c:\users\owner\documents\github\hyperspy\hyperspy\learn\mva.py in get_explained_variance_ratio(self)
1311 target = self.learning_results
1312 if target.explained_variance_ratio is None:
-> 1313 raise AttributeError(
1314 "The explained_variance_ratio attribute is "
1315 "`None`, did you forget to perform a PCA "
AttributeError: The explained_variance_ratio attribute is `None`, did you forget to perform a PCA decomposition?
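If I read the MVA code correctly (my interpretation, not stated in the traceback), ORPCA does not fill in learning_results.explained_variance_ratio; only variance-based algorithms such as SVD do. Running a plain SVD decomposition first should make the plot work:
si.decomposition(algorithm='SVD')  # populates learning_results.explained_variance_ratio
si.plot_explained_variance_ratio(threshold=4, xaxis_type='number')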
import hyperspy.api as hs
import h5py
import dask
import dask.array as da
f = h5py.File('data_path.hdf5', 'r')
edx_da = da.from_array(f['data_edx'], chunks=[1,128,128,4096])
f.close()
edx_hs = hs.signals.EDSTEMSpectrum(edx_da).as_lazy()
# <LazyEDSTEMSpectrum, title: , dimensions: (512, 512, 98|4096)>
test = edx_hs.inav[:,:,0]
test.compute()
ValueError: Not a dataset (not a dataset)
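The error is most likely from f.close(): da.from_array wraps the h5py dataset lazily, so the file handle has to stay open until after compute(). A minimal sketch of the fix, with the same (placeholder) file and dataset names:
f = h5py.File('data_path.hdf5', 'r')  # keep the handle open while computing
edx_da = da.from_array(f['data_edx'], chunks=(1, 128, 128, 4096))
edx_hs = hs.signals.EDSTEMSpectrum(edx_da).as_lazy()
test = edx_hs.inav[:, :, 0]
test.compute()  # succeeds because the underlying dataset is still valid
f.close()  # close only after all computations are done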
In:
import numpy as np
import pandas as pd
quant_elms = ['O_K', 'Si_K', 'Sc_K', 'Er_L', 'Nd_L', 'Yb_L', 'Lu_L']
quant_sigs = [result[i] for i in [10, -3, -5, 1, 8, -2, 4]]  # `result` is defined earlier in the thread (not shown)
quant_K = np.array([1.0, 0.885161, 0.98704, 1.45108, 1.61348, 1.6705, 1.69536])
print('Intensities')
print(pd.DataFrame(hs.stack(quant_sigs).data.T, columns=quant_elms))
quant = sroi_stack.quantification(quant_sigs, method='CL', factors=1/quant_K, composition_units='weight', max_iterations=100)
print('Wt%')
print(pd.DataFrame(hs.stack(quant).data.T, columns=quant_elms))
Out:
Intensities
[########################################] | 100% Completed | 0.1s
O_K Si_K Sc_K Er_L Nd_L Yb_L Lu_L
0 10.096292 10.968796 0.046850 0.051064 0.015106 0.048460 0.038284
1 6.665914 5.675549 2.449875 3.110012 1.423448 2.923658 2.596810
2 6.598061 5.350703 2.364560 3.020911 1.404513 2.716718 2.388128
3 7.454355 5.216240 0.247508 1.743997 7.806576 0.876016 0.638009
Wt%
[########################################] | 100% Completed | 0.1s
O_K Si_K Sc_K Er_L Nd_L Yb_L Lu_L
0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
1 0.0 0.0 0.0 0.0 0.0 0.0 0.0
2 0.0 0.0 0.0 0.0 0.0 0.0 0.0
3 0.0 0.0 0.0 0.0 0.0 0.0 0.0
In:
weight_percent = [sroi_stack.inav[i].quantification([sig.inav[i] for sig in quant_sigs],
method='CL', factors=quant_K, composition_units='weight', max_iterations=100, convergence_criterion=0.001)
for i in range(0,4)]
print(pd.DataFrame(np.array([[j.data[0] for j in i] for i in weight_percent]), columns=quant_elms))
Out:
O_K Si_K Sc_K Er_L Nd_L Yb_L Lu_L
0 50.240269 48.313808 0.230111 0.368721 0.121285 0.402830 0.322976
1 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000
2 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000
3 24.858340 15.397206 0.814680 8.439159 42.003570 4.880007 3.607039
@k8macarthur Yes those are good checks.
I have checked the metadata and all edges are included:
Acquisition_instrument
TEM
Detector
EDS
azimuth_angle = 0.0
elevation_angle = 35.0
energy_resolution_MnKa = 131.45115116593425
number_of_frames = 58
Stage
tilt_alpha = 0.005
tilt_beta = -0.0
x = -0.000321
y = -8.1e-05
z = -2.5e-05
beam_energy = 200.0
camera_length = 73.0
magnification = 20000.0
microscope = Titan
General
date = 2021-08-18
original_filename = 1032 SI 23500 x HAADF-BF_original.emd
time = 10:32:08-04:00
time_zone = Eastern Daylight Time
title = Stack of EDS
Sample
elements = ['C', 'Er', 'Ga', 'Lu', 'Mo', 'N', 'Nd', 'O', 'Pt', 'Sc', 'Si', 'Yb']
xray_lines <list>
[0] = C_Ka
[1] = Er_La
[2] = Er_Mb
[3] = Er_Mz
[4] = Ga_La
[5] = Lu_La
[6] = Lu_Ma
[7] = Mo_La
[8] = N_Ka
[9] = Nd_La
[10] = Nd_Lb1
[11] = Nd_Lb2
[12] = Nd_Lb3
[13] = Nd_Lg1
[14] = Nd_Ma
[15] = Nd_Mg
[16] = Nd_Mz
[17] = O_Ka
[18] = Pt_Ma
[19] = Sc_Ka
[20] = Sc_Kb
[21] = Si_Ka
[22] = Yb_La
[23] = Yb_Ma
[24] = Yb_Mb
Signal
binned = True
signal_type = EDS_TEM
I have not yet tried the zeta or cross_section quantifications, but I did try feeding the intensities directly into hyperspy.misc.eds.utils.quantification_cliff_lorimer, as suggested by @thomasaarholt. Using the util directly gave results for all navigation indices, so something must be going wrong in quantification() when the attributes are pulled from the signal.
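A minimal sketch of calling the util directly with the arrays from above (the positional argument order, intensities then k-factors, is my reading of the HyperSpy source and should be treated as an assumption):
import numpy as np
from hyperspy.misc.eds.utils import quantification_cliff_lorimer

# One intensity array per X-ray line, stacked along the first axis
intensities = np.array([sig.data for sig in quant_sigs])
weight_fractions = quantification_cliff_lorimer(intensities, quant_K)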
I am trying to register a new signal type. I have done it as described here: https://hyperspy.readthedocs.io/en/latest/dev_guide/writing_extensions.html. I have added an entry point in my setup.py and created a hyperspy_extension.yaml (in the same folder as setup.py) with all the required info. But when I use hs.print_known_signal_types(), my new signal does not appear.
I found a way to work around this problem by putting
import hyperspy.extensions as e
e.ALL_EXTENSIONS["signals"]["MySignal"] = {'signal_type': 'MySignal',
'signal_dimension': 1,
'dtype': 'real',
'lazy': False,
'module': 'mymodule.datasets.signal'}
in the signal.py file.
What am I doing wrong?
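For comparison, a minimal sketch of the registration the extension guide describes, reusing the mymodule name from the workaround above (everything else is assumed):
# setup.py
from setuptools import setup

setup(
    name='mymodule',
    packages=['mymodule'],
    # the yaml must be shipped inside the package, not just next to setup.py
    package_data={'mymodule': ['hyperspy_extension.yaml']},
    entry_points={'hyperspy.extensions': ['mymodule = mymodule']},
)

# mymodule/hyperspy_extension.yaml (mirrors the ALL_EXTENSIONS dict above)
# signals:
#   MySignal:
#     signal_type: MySignal
#     signal_dimension: 1
#     dtype: real
#     lazy: False
#     module: mymodule.datasets.signal
One thing worth double-checking (my guess at the cause): the guide expects hyperspy_extension.yaml inside the importable package, whereas the description above places it next to setup.py, where the entry point cannot find it.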