On Windows 7, after updating all packages using conda, I get the following error when importing HyperSpy: ---------------------------------------------------------------------------
ModuleNotFoundError Traceback (most recent call last)
<ipython-input-3-d7e81a299c63> in <module>
12 get_ipython().run_line_magic('matplotlib', 'widget')
---> 14 import hyperspy.api as hs
15 import numpy as np
~\Anaconda3\lib\site-packages\hyperspy\api.py in <module>
----> 1 from hyperspy.api_nogui import *
2 import logging
3 _logger = logging.getLogger(__name__)
5 __doc__ = hyperspy.api_nogui.__doc__
~\Anaconda3\lib\site-packages\hyperspy\api_nogui.py in <module>
12 from hyperspy.utils import *
13 from hyperspy.io import load
---> 14 from hyperspy import signals
15 from hyperspy.Release import version as version
16 from hyperspy import docstrings
~\Anaconda3\lib\site-packages\hyperspy\signals.py in <module>
48 _g[_signal] = getattr(
---> 50 _specs["module"]), _signal)
52 del importlib
~\Anaconda3\lib\importlib\__init__.py in import_module(name, package)
126 level += 1
--> 127 return _bootstrap._gcd_import(name[level:], package, level)
~\Anaconda3\lib\site-packages\hyperspy\_signals\signal2d.py in <module>
24 import logging
25 from scipy.fftpack import fftn, ifftn
---> 26 from skimage.feature.register_translation import _upsampled_dft
28 from hyperspy.defaults_parser import preferences
ModuleNotFoundError: No module named 'skimage.feature.register_translation'
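The traceback above comes from a version mismatch: scikit-image deprecated `register_translation` in 0.17 and later removed it, while older HyperSpy releases still import the private `_upsampled_dft` helper from the old location. The usual fixes are upgrading HyperSpy or pinning scikit-image (e.g. `conda install "scikit-image<0.17"`). As a minimal sketch, assuming nothing about which scikit-image version (if any) is installed, the two candidate locations can be probed like this:

```python
# Hedged sketch: probe both known locations of _upsampled_dft.
# The new module path is private and may change between scikit-image releases.
try:
    # scikit-image >= 0.17 moved the helper under skimage.registration
    from skimage.registration._phase_cross_correlation import _upsampled_dft
except ImportError:
    try:
        # scikit-image < 0.17: the location older HyperSpy releases expect
        from skimage.feature.register_translation import _upsampled_dft
    except ImportError:
        _upsampled_dft = None  # scikit-image missing, or the layout changed again

HAVE_UPSAMPLED_DFT = _upsampled_dft is not None
```

This is a diagnostic shim, not a patch to HyperSpy itself; the robust fix remains matching the HyperSpy and scikit-image versions.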
Hi, I am a new user of HyperSpy and I am specifically learning the EELS quantification module. I was following the tutorial and I am facing the following issues:
s = hs.datasets.eelsdb(title="Hexagonal Boron Nitride",
ll = hs.datasets.eelsdb(title="Hexagonal Boron Nitride",
m = s.create_model(ll=ll)
m.plot(plot_components=True) # It plots all the components, how can I extract individual components here?
edges = ("N_K","B_K")
hs.plot.plot_spectra([m[edge].intensity.as_signal("std") for edge in edges], legend=edges)
Could anyone please help me understand this?
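On extracting individual components from the snippet above: a fitted HyperSpy model can be indexed by component name, and `Model.as_signal(component_list=...)` evaluates the model using only the components you list. A minimal sketch, assuming `m` is the fitted model from the snippet (the helper name is mine, not HyperSpy API):

```python
# Hypothetical convenience wrapper around HyperSpy's model indexing.
# model[name] returns the named component; as_signal(component_list=[name])
# returns the model evaluated with only that component active.
def extract_component(model, name):
    component = model[name]                           # e.g. m["N_K"]
    signal = model.as_signal(component_list=[name])   # that edge's contribution
    return component, signal

# Usage, only meaningful with a real fitted model:
# n_k_component, n_k_signal = extract_component(m, "N_K")
# n_k_signal.plot()
```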
set_signal_range(x11=None, x12=None, x21=None, x22=None), or I set the range with a bool array of the right shape: set_signal_range(mask=None). The former choice has the advantage of being consistent with the 1D implementation, while the latter should generalise to ND, which is another advantage. So I was wondering whether this discussion had already been raised in 1D, and how the community would feel about such a change of paradigm. Maybe there is a way to extend the function to set_signal_range(self, x1=None, x2=None, mask=None), where passing a mask would override x1 and x2 but preserve their use for backward compatibility?
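The backward-compatible extension can be illustrated in a few lines: an (x1, x2) range on one axis is just a special case of a boolean mask, so a mask= keyword can subsume the scalar bounds. This is a sketch with a hypothetical helper, not the HyperSpy implementation:

```python
import numpy as np

# Hypothetical helper illustrating the proposed signature: mask overrides
# the scalar bounds; otherwise the bounds are converted into a mask.
def signal_range_mask(axis_values, x1=None, x2=None, mask=None):
    if mask is not None:                      # mask takes precedence
        return np.asarray(mask, dtype=bool)
    lo = -np.inf if x1 is None else x1        # open-ended when a bound is omitted
    hi = np.inf if x2 is None else x2
    return (axis_values >= lo) & (axis_values <= hi)

energy = np.linspace(0.0, 10.0, 11)
m1 = signal_range_mask(energy, x1=2.0, x2=5.0)
m2 = signal_range_mask(energy, mask=(energy >= 2.0) & (energy <= 5.0))
assert np.array_equal(m1, m2)  # both call styles select the same channels
```

Since the mask is just an array of booleans over the signal axes, the same keyword generalises directly to ND signals, which the scalar-bounds form does not.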
decomposition with the mlpca algorithm on an 80x80 (pixels) by 1980 (energy channels) EDX dataset. I let it run for more than 3 hours and it did not converge. My computer has an i7-8750H 2.2 GHz CPU and 32 GB of RAM. I know that MLPCA is computationally heavy, but that seems very long given the dataset size. Is that normal behaviour, or is there an issue here?
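For a rough sense of why this dataset is heavy: MLPCA-style algorithms iterate, and each iteration involves an SVD-like factorisation of the unfolded data matrix. This is a back-of-the-envelope sketch of the cost, not the actual HyperSpy implementation:

```python
import numpy as np

# The 80x80 x 1980 dataset unfolds to a 6400 x 1980 matrix.
# A full SVD costs roughly O(m * n * min(m, n)) flops, paid every iteration.
m, n = 80 * 80, 1980
flops_per_svd = m * n * min(m, n)  # ignoring constant factors
print(f"unfolded matrix: {m} x {n}, ~{flops_per_svd:.1e} flops per full SVD")

# Toy SVD on a small matrix, just to show the shapes involved:
small = np.random.default_rng(0).normal(size=(64, 30))
u, sv, vt = np.linalg.svd(small, full_matrices=False)
assert u.shape == (64, 30) and sv.shape == (30,) and vt.shape == (30, 30)
```

So even at a few gigaflops of effective throughput, many iterations over a matrix of this size add up quickly; whether 3+ hours is expected still depends on the iteration count and implementation details.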
import hyperspy.api as hs; hs.set_log_level("INFO") will give you a sense of how fast each iteration runs. There's not a lot of logging, but there is some.
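What `set_log_level` amounts to, sketched with the standard library alone (the logger name "hyperspy" is an assumption of this sketch; the real helper hides that detail):

```python
import logging

# Lowering the threshold of the library's logger hierarchy makes
# INFO-level progress messages (e.g. per-iteration reports) visible.
logging.basicConfig(format="%(name)s:%(levelname)s: %(message)s")
logger = logging.getLogger("hyperspy")  # assumed logger name, for illustration
logger.setLevel(logging.INFO)
logger.info("illustrative message at the now-visible INFO level")
```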
hs.set_log_level("INFO") thanks a lot. From my perspective, it would be nice to have this feature easier to reach when using decomposition (maybe as an argument of the function?). Improving the quantity and content of the displayed info would also be a nice addition.
print_info=True, which will output a little bit more. In general there's not a lot more you can print from decomposition - many of the algorithms don't have a lot to log. That said, some of them do, e.g. NMF takes a verbose argument. You can also pass your own sklearn estimators as the algorithm argument in the latest code - they sometimes take a
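A minimal sketch of that pattern, assuming a recent HyperSpy and scikit-learn (the import is guarded so the snippet runs even without sklearn installed):

```python
# Configure a scikit-learn estimator with its own verbosity switch,
# then hand it to HyperSpy's decomposition() as the algorithm argument.
try:
    from sklearn.decomposition import NMF
    # verbose=1 makes NMF print per-iteration progress during the fit
    estimator = NMF(n_components=3, verbose=1, max_iter=200)
except ImportError:
    estimator = None  # scikit-learn not installed

# Usage, assuming a HyperSpy signal `s`:
# s.decomposition(algorithm=estimator)
```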