import hyperspy.api as hs
import numpy as np
data = np.zeros((63, 51, 400, 975))
s = hs.signals.Signal2D(data)
s
# <Signal2D, title: , dimensions: (51, 63|975, 400)>
s.isig[2:4, 2:3]
# <Signal2D, title: , dimensions: (51, 63|2, 1)>
import hyperspy.api as hs
import numpy as np
data = np.zeros((63, 51, 400, 975))
ax1 = {'size' : 63}
ax2 = {'size' : 51}
ax3 = {'size' : 400}
ax4 = {'size' : 975}
s = hs.signals.Signal2D(data, axes=[ax2, ax1, ax3, ax4])
s
# <Signal2D, title: , dimensions: (63, 51|975, 400)>
s.isig[1:2, 3:4]
# <Signal2D, title: , dimensions: (51, 63|1, 1)>
Here I fed ax1 and ax2 in the wrong order. HyperSpy accepts this, but then updates the dimensions when slicing the signal.
Thank you Thomas! I indeed found an error in the sizes of the input axis dictionaries.
This fixed my problem.
Perhaps a quality-of-life improvement could be an assert statement in the signal init method that checks the axis sizes against the data shape? Since I had to define the axes manually (I had them as vectors), it is quite easy to mistype a value or specify the axes in the wrong order.
Hi everyone, I am not sure if this is the right place to post this, but I encounter an error when Atomap is refining Gaussians on a second sublattice of a nanoparticle. The warning repeats many times over, but the Gaussian fitting progresses to the end.
WARNING:hyperspy.model:Covariance of the parameters could not be estimated. Estimated parameter standard deviations will be np.nan.
WARNING:hyperspy.model: m.fit() did not exit successfully. Reason: Number of calls to function has reached maxfev = 5000.
It works perfectly fine for the first sublattice despite it having more atom columns.
The warning appears at about 30% of the Gaussian refinement. Initially I thought it might be the number of atoms, so I cropped the image to a few rows, but the same thing still happens. I then changed the maxfev value in the atom_finding_refining script to 5000; the warning persists, only now reporting the updated value of maxfev = 5000.
Your help in any way is much appreciated,
Thanks
pip install --upgrade atomap==0.2.1
I have imported a Raman mapscan to Hyperspy as a Signal1D, with the x-y directions of the map as the navigation axes. This is the code I used:
import hyperspy.api as hs
import numpy as np

# Build the signal with named axes taken from the data's shape
dict0 = {'name': 'x', 'size': len(data_y)}
dict1 = {'name': 'y', 'size': len(data_y[0])}
dict2 = {'name': 'intensity', 'size': len(data_y[0][0])}
s_hs = hs.signals.Signal1D(np.array(data_y), axes=[dict0, dict1, dict2])

# Calibrate the wavenumber axis from the first pixel's x data
factor = len(s_hs.axes_manager['intensity'].axis) / (max(data_x[0][0]) - min(data_x[0][0]))
s_hs.axes_manager['intensity'].scale = 1 / factor
s_hs.axes_manager['intensity'].offset = min(data_x[0][0])
where data_y is an array of rows along the x axis, each row being a set of arrays, each of which is a Raman spectrum (the intensities). data_x has the same structure, but each pixel holds an array of wavenumber shifts corresponding to the data points in data_y.
I was able to load the data into Hyperspy and also get the wavenumber axis scaled properly. I have two questions and would be very grateful for assistance. Firstly, is there a better way to synchronise the x axis with the data? Or is the way I've done it fine?
Secondly, when I try to go to a specific position in navigation or signal space, e.g. s_test = s_hs.inav[5, 5], the x-axis data isn't carried along: the new s_test object has an x axis which is just integers from 0 to the size of the data set. Is there a way to get around this? This latter question is my main concern.
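On the calibration itself: one sanity check (a plain NumPy sketch, assuming the wavenumber axis is uniform; the axis values here are made up) is that an offset/scale calibration should round-trip the original axis. Note the n - 1 in the scale, since n points span n - 1 intervals, which differs slightly from dividing the span by n as in the snippet above:

```python
import numpy as np

# Hypothetical uniform wavenumber axis for one pixel (stand-in for data_x[0][0])
wavenumbers = np.linspace(100.0, 3100.0, 1000)

offset = wavenumbers.min()
# n points cover n - 1 intervals, hence the len(...) - 1
scale = (wavenumbers.max() - wavenumbers.min()) / (len(wavenumbers) - 1)

# Reconstruct the axis from the calibration and check it round-trips
reconstructed = offset + scale * np.arange(len(wavenumbers))
assert np.allclose(reconstructed, wavenumbers)
```

If the reconstruction drifts from the original values towards the high-wavenumber end, the scale is off by that one-interval factor.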