Ariel Rokem
@arokem
But it looks like you've solved that already. Let me see if I can address the other issues:
Regarding the 7 vs. 14 directions: Don't you get the same ADC value for the two vectors representing the same direction?
I think that should be the case (and it should incorporate information from both b-values, although you should be careful with that, especially if the higher b-value is above b=1,000, because the signal is probably not Gaussian in many places).
Ariel Rokem
@arokem
I am not sure that I follow the question about the design matrix. In case it helps, take a look at equations 6 - 9 in this paper: https://www.sciencedirect.com/science/article/pii/S1053811906007403.
jarcosh
@jarcosh
@arokem I feel silly now: you are completely correct that the ADC values for the two vectors representing the same direction are exactly the same. Being happy that the voxel values now seemed reasonable, annoyed that the output still seemed to have the wrong dimensions, and busy with other issues, I didn't compare ROIs as thoroughly as I usually do and didn't notice.
So I moved on to something else and only noticed just now. I will compare the output from our other software against the dipy values, but as far as using dipy correctly goes, I managed thanks to your advice. Your reference seems a better explainer than the one I linked, so thanks for that too; I will read it carefully. Edit: (and yes, equation 9 clarifies how it actually works and how dipy manages it, and clears up my confusion there too)
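The symmetry discussed above follows from the tensor model itself: the ADC along a direction g is the quadratic form g^T D g, which is unchanged when g is negated. A minimal numpy sketch (the tensor D and direction below are made-up values for illustration):

```python
import numpy as np

# Hypothetical diffusion tensor (symmetric), in mm^2/s
D = np.array([[1.7, 0.2, 0.1],
              [0.2, 0.5, 0.0],
              [0.1, 0.0, 0.4]]) * 1e-3

def adc(g, D):
    """Apparent diffusion coefficient along unit gradient direction g."""
    g = np.asarray(g, dtype=float)
    g = g / np.linalg.norm(g)
    return g @ D @ g

g = np.array([1.0, 1.0, 0.0])
# ADC is even in g: negating the gradient direction gives the same value
print(np.isclose(adc(g, D), adc(-g, D)))  # True
```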
rosella1234
@rosella1234
@arokem thank you! Should I replace NaNs with something else in image matrix, then?
Steven Meisler
@smeisler
I have a .tck tractogram and want to use voxel2streamline to see which streamlines pass through which voxels. I feed in the streamlines and the inverse of the DWI affine to map from streamline to voxel space. I am having trouble interpreting the output, as I do not know how to map a voxel index to a location. Any guidance would be appreciated, thanks!
Steven Meisler
@smeisler
it appears dipy.tracking.vox2track.streamline_mapping is better suited for my needs
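Conceptually, what such a mapping does can be sketched in plain numpy: transform each streamline point from world to voxel coordinates with the inverse affine, round to the nearest index, and collect which streamlines touch which voxel. This is only a rough sketch of the idea, not DIPY's implementation (the toy affine and streamline below are made up):

```python
import numpy as np
from collections import defaultdict

def streamline_to_voxel_map(streamlines, affine):
    """Rough sketch of a voxel -> streamline-indices mapping.

    streamlines: list of (N_i, 3) arrays in world (scanner) coordinates.
    affine: 4x4 voxel-to-world affine of the reference image.
    """
    inv = np.linalg.inv(affine)
    mapping = defaultdict(set)
    for sl_idx, sl in enumerate(streamlines):
        # world -> voxel: apply the inverse affine, then round to an index
        vox = sl @ inv[:3, :3].T + inv[:3, 3]
        for ijk in np.round(vox).astype(int):
            mapping[tuple(ijk)].add(sl_idx)
    return mapping

affine = np.diag([2.0, 2.0, 2.0, 1.0])  # toy 2 mm isotropic grid
sl = [np.array([[0.0, 0.0, 0.0], [2.0, 2.0, 0.0]])]
print(dict(streamline_to_voxel_map(sl, affine)))
```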
willi3by
@willi3by
Thank you @Garyfallidis ! I will start the discussion
Eleftherios Garyfallidis
@Garyfallidis

Hi Everyone! We are very happy to announce that a new educational course is available online: https://youtube.com/playlist?list=PLRZ9VSqV-6soOC0rUEAOV-QiSa_Qxk8JM :rocket: This is a complete course on Diffusion MRI with Theory and Practice :snake: This is your chance to go from zero-to-hero in using :star: DIPY :star: Topics include -
:white_check_mark: dMRI Basics
:white_check_mark: Microstructure Modeling
:white_check_mark: Denoising
:white_check_mark: Tractometry
:white_check_mark: Dictionary Learning
:white_check_mark: Registration
:white_check_mark: Tissue Classification
:white_check_mark: Deep Learning methodologies in MRI
:white_check_mark: High Field MRI
:white_check_mark: GPU Acceleration
:white_check_mark: Data Harmonization
:white_check_mark: ML in the Clinic and much more.....

Do not miss out on this golden opportunity! If you do like and use DIPY please do give us a star and cite the piece of software that you are using! It encourages us to keep going! May your data live a new life!!

Oliver Li
@motivationss
Hi everyone! I have an issue when trying to transform streamlines back into the original 3D coordinates. I have found several functions to transform 3D coordinates into streamlines, but not vice versa. Is there a function in dipy for this purpose?
Eleftherios Garyfallidis
@Garyfallidis
@motivationss are you using the Stateful Tractogram?
Hello everyone. I am having an error running DTI on NIfTI images. Here is the source code.
fbval = f'{file2_path}.bval'
fbvec = f'{file2_path}.bvec'
data, affine, img = load_nifti(file2_path, return_img=True)
# bvals/bvecs and gtab are used below; presumably they were set up like this:
bvals, bvecs = read_bvals_bvecs(fbval, fbvec)
gtab = gradient_table(bvals, bvecs)
print(bvals)
print(gtab)
tenmodel = TensorModel(gtab)
print(len(bvals))
new_img = img.get_fdata()
tenfit = tenmodel.fit(new_img)
The error I am getting is this
Traceback (most recent call last):
File "main.py", line 197, in <module>
convert(args.f1t, args.f2t, args.f1p, args.f2p, args.s, args.diff)
File "main.py", line 171, in convert
tenfit = tenmodel.fit(new_img)
File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/dipy/reconst/dti.py", line 793, in fit
**self.kwargs)
File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/dipy/reconst/dti.py", line 1303, in wrapped_fit_tensor
*args,
**kwargs)
File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/dipy/reconst/dti.py", line 1381, in wls_fit_tensor
w = np.exp(np.einsum('...ij,...j', ols_fit, log_s))
File "<__array_function__ internals>", line 6, in einsum
File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/numpy/core/einsumfunc.py", line 1356, in einsum
return c_einsum(*operands, **kwargs)
ValueError: operands could not be broadcast together with remapped shapes [original->remapped]: (10,10)->(10,10) (10000,11)->(10000,newaxis,11)
I have tried everything including this link right here, dipy/dipy#1790, but it still does not work for me
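Broadcast errors like this usually mean the gradient table and the data are out of sync: the last axis of the 4D volume must have exactly as many volumes as there are b-values. A quick sanity check to run before TensorModel.fit (the shapes below are made up for illustration):

```python
import numpy as np

def check_gtab_matches(data_shape, bvals):
    """Raise if the number of volumes and b-values disagree."""
    n_vols, n_bvals = data_shape[-1], len(bvals)
    if n_vols != n_bvals:
        raise ValueError(
            f"{n_vols} volumes but {n_bvals} b-values: "
            "gradient table and data are out of sync"
        )
    return True

# Hypothetical example
print(check_gtab_matches((96, 96, 60, 11), np.zeros(11)))  # True
```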
Oliver Li
@motivationss

@motivationss are you using the Stateful Tractogram?

Thanks for your reply. No, I did not use the Stateful Tractogram and I generated streamlines from fiber tracking.

Ajay Bhargava
I have registered two volumes using dipy.align._public.SymmetricDiffeomorphicRegistration. Using the .update() method, I have interpolated the displacement to a new 3D shape of (S,R,C,3). I would now like to split this 3D array into a list of 2D arrays and then apply each deformation field on a new 2D image. Is there a way of doing that?
Ajay Bhargava
Specifically, is there a way of applying a deformation field from the map.forward() attribute on a new image? I want to save these deformation fields for later use.
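The core operation behind applying a saved 2D deformation field is resampling: for each output pixel, read from the displaced source location. A deliberately simple nearest-neighbour sketch in numpy (DIPY's mapping interpolates properly; this only illustrates the idea, with toy data):

```python
import numpy as np

def warp_2d(image, disp):
    """Apply a 2D displacement field with nearest-neighbour sampling.

    image: (R, C) array; disp: (R, C, 2), where disp[r, c] is the
    offset (in pixels) to sample from.
    """
    R, C = image.shape
    rr, cc = np.meshgrid(np.arange(R), np.arange(C), indexing="ij")
    src_r = np.clip(np.round(rr + disp[..., 0]).astype(int), 0, R - 1)
    src_c = np.clip(np.round(cc + disp[..., 1]).astype(int), 0, C - 1)
    return image[src_r, src_c]

img = np.arange(9.0).reshape(3, 3)
zero = np.zeros((3, 3, 2))
print(np.array_equal(warp_2d(img, zero), img))  # identity field: True
```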
@portokalh
anybody can suggest how to move fwds past this error message when using dipy_horizon: "TypeError: buffer is too small for requested array"? thanks much! alex
Eleftherios Garyfallidis
@Garyfallidis
@portokalh what was the input? And how did you run the command?
Eleftherios Garyfallidis
@Garyfallidis
@AtulPhadke can you print data.shape and new_img.shape?
araikes
@araikes
Is there a CLI approach where I could use an existing mask to crop an image? dipy_median_otsu has the --autocrop option but that computes another mask potentially unnecessarily.
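Pending a CLI option, cropping to an existing mask is a small numpy operation: take the bounding box of the nonzero mask voxels and slice the image with it. A sketch (function name and toy data are made up; this is not an existing DIPY command):

```python
import numpy as np

def crop_to_mask(image, mask):
    """Crop `image` to the bounding box of a binary `mask`."""
    coords = np.argwhere(mask)
    mins = coords.min(axis=0)
    maxs = coords.max(axis=0) + 1  # exclusive upper bound
    slicer = tuple(slice(lo, hi) for lo, hi in zip(mins, maxs))
    return image[slicer]

img = np.arange(64.0).reshape(4, 4, 4)
mask = np.zeros((4, 4, 4), dtype=bool)
mask[1:3, 1:3, 1:3] = True
print(crop_to_mask(img, mask).shape)  # (2, 2, 2)
```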
I have a question: if I have multi-shell data and want to fit a tensor model, how can I select only the b = 0 and b = 1000 volumes of my data?
Hi @mohammadhadiarabi -- Great question! This would work via changing the bvals and bvecs you pass to the gtab object when you fit the data with TensorModel as follows:
# these are your original bvals and bvecs
bvals = gtab.bvals
bvecs = gtab.bvecs

# here we select only the 0 and 1000 bvals
selected_bvals = np.logical_or(bvals == 0, bvals == 1000)

# here we pick only the 3D volumes from the 4D dMRI data pertaining to the 0 and 1000 shells selected above.
data_selected = data[:, :, :, selected_bvals]

# now we also take only the bvals and bvecs of the same 0 and 1000 shells and build a new gradient table.
gtab_selected = gradient_table(bvals[selected_bvals], bvecs[selected_bvals])

# now you can use these to fit the DTI model
tenmodel = dti.TensorModel(gtab_selected)
tenfit = tenmodel.fit(data_selected)
Eleftherios Garyfallidis
@Garyfallidis
@araikes I do not think we have such a CLI. Can you write an issue? Feel free to start a PR too (if you can).
@ShreyasFadnavis I have another question. How can I save selected new bval, bvec files as .bval and bvec?
@bgolshaei
Hi, I would like to have a strain field instead of latics from image registration, Do you have any example for this?
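For reference, a strain field can be derived directly from a displacement field: the infinitesimal strain tensor is E = 0.5 * (grad(u) + grad(u)^T) per voxel. A 2D numpy sketch (the uniform-stretch example is made up; for a registration result, u would come from the optimized mapping):

```python
import numpy as np

def small_strain_2d(u):
    """Infinitesimal strain tensor field from a 2D displacement field.

    u: (R, C, 2) displacement field. Returns (R, C, 2, 2) with
    E = 0.5 * (grad(u) + grad(u)^T) at each pixel.
    """
    du = np.empty(u.shape[:2] + (2, 2))
    for i in range(2):
        grads = np.gradient(u[..., i])  # derivatives along axis 0 and 1
        for j in range(2):
            du[..., i, j] = grads[j]    # du[i]/dx[j]
    return 0.5 * (du + np.swapaxes(du, -1, -2))

# Uniform 10% stretch along axis 0: u0 = 0.1 * x0
x0 = np.arange(5.0)[:, None] * np.ones((1, 5))
u = np.stack([0.1 * x0, np.zeros((5, 5))], axis=-1)
E = small_strain_2d(u)
print(np.allclose(E[..., 0, 0], 0.1))  # True
```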
import numpy as np; np.savetxt('bval_filename.bval', bvals); np.savetxt('bvec_filename.bvec', bvecs)
jarcosh
@jarcosh
hi, I was trying to get the residuals of my data fit from dipy.reconst.dti, to derive an R^2 fit for each voxel with a separate homemade function. I'm using fit_method = 'NLLS', so I checked _nlls_err_func. I keep getting ValueError: shapes not aligned, so I know I must be doing something wrong. I have two questions: a) where should I best get the 3x3 tensor from? The tenfit object has several variables with dimensions (inputXaxis, inputYaxis, inputSlices, 3, 3), or (3, 3, inputXaxis, inputYaxis, inputSlices) such as tenfit.evecs.T. b) How do I best use it if I simply want an array of the residuals for each voxel? I assume what I want is residuals = _nlls_err_func(tensor, tenmodel.design_matrix, data), but I must be fetching something wrong.
As an example of what I mean, here's a test case I tried to do with a specific voxel in the first slice to see if I could fill the array iteratively later. As far as I can tell I'm following the instructions: 3x3 matrix, design matrix, array of signal during acquisition over all acquisition directions and b values, but I get an error
Ariel Rokem
@arokem
@jarcosh: I think that you need to reshape the tensor to a vector. You might want to take a look at how that's handled here: https://github.com/dipy/dipy/blob/master/dipy/reconst/dti.py#L1665-L1705
Chandana Kodiweera
@kodiweera
Is there a project outline for dipy such as a module connections graph?
Chandana Kodiweera
@kodiweera

Is there a project outline for dipy such as a module connections graph?

Such as these I created for fmriprep and qsiprep: https://github.com/nipreps/fmriprep/discussions/2517 and https://github.com/PennLINC/qsiprep/discussions/289

jarcosh
@jarcosh
@arokem thanks for your quick response, reading the docstring for that function it did not occur to me that the tensor matrix ought to be flattened. However, no matter whether I use np.matrix.flatten to turn the 3x3 matrix into a 9-vector or use tenfit.quadratic_form.flat, I still get value errors at the y = np.exp(np.dot(design_matrix, tensor)) step. My design_matrix is (16, 7) shaped, taking b-values and directions from the acquisition instead of taking one of the default spheres from dipy; might this be causing the unexpected behavior? Should I transform it in some way, like how apparent_diffusion_coef uses the transposed version? Apologies if the issue is my lack of mathematical knowledge, but I'm at a loss here.
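One way to sidestep the internal error functions entirely: compute the model's predicted signal per voxel first, then derive R^2 from the residuals yourself. A generic numpy sketch (the toy signal is made up; how you obtain `predicted` from your fit is up to you):

```python
import numpy as np

def r_squared(signal, predicted):
    """Per-voxel coefficient of determination.

    signal, predicted: (..., N) arrays with the N measurements on the
    last axis; returns an array with one R^2 value per voxel.
    """
    resid = signal - predicted
    ss_res = np.sum(resid ** 2, axis=-1)
    ss_tot = np.sum((signal - signal.mean(axis=-1, keepdims=True)) ** 2,
                    axis=-1)
    return 1.0 - ss_res / ss_tot

signal = np.array([[1.0, 2.0, 3.0, 4.0]])  # one toy voxel, 4 measurements
print(r_squared(signal, signal))  # perfect prediction gives R^2 = 1
```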
Yashvardhan Jain
@J-Yash
Hi, I'm working on symmetric diffeomorphic 3D image registration. I was wondering if there is an easy way to calculate the jacobian determinant image of the resulting registered image in DIPY?
@kodiweera ! That is a great question. The graph that you are mentioning typically comes from NiPype -- which I believe is how the nodes are connected for a dataflow style of computation. DIPY on the other hand is a package that provides tools for what happens inside those nodes and therefore does not need a dataflow graph. However, I do know it is trivial to build these pipelines with DIPY and would like to know what exactly you need. Are you using vanilla DIPY to build pipelines?
@J-Yash : I believe this is possible from looking at the codebase -- https://github.com/dipy/dipy/blob/c846bf5a23b7e95343a9cf231df2653473602456/dipy/align/transforms.pyx#L52 is what you need, I assume. I am not very well-versed with this module, so I am tagging @Garyfallidis, @skoudoro and @arokem to correct me if I am wrong. They may also be able to provide more guidance on what you need!
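For the Jacobian determinant itself, the math is straightforward once you have the displacement field: the deformation is x -> x + u(x), so det J = det(I + grad(u)) per voxel. A 2D numpy sketch (toy fields only; for a registration result, u would come from the mapping):

```python
import numpy as np

def jacobian_determinant_2d(u):
    """Jacobian determinant map of the deformation x -> x + u(x).

    u: (R, C, 2) displacement field. Values < 1 indicate local
    compression, > 1 local expansion.
    """
    du = np.empty(u.shape[:2] + (2, 2))
    for i in range(2):
        grads = np.gradient(u[..., i])  # derivatives along axis 0 and 1
        for j in range(2):
            du[..., i, j] = grads[j]    # du[i]/dx[j]
    J = np.eye(2) + du
    return J[..., 0, 0] * J[..., 1, 1] - J[..., 0, 1] * J[..., 1, 0]

u = np.zeros((4, 4, 2))  # identity deformation
print(np.allclose(jacobian_determinant_2d(u), 1.0))  # True
```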
Chandana Kodiweera
@kodiweera

@kodiweera ! That is a great question. The graph that you are mentioning typically comes from NiPype -- which I believe is how the nodes are connected for a dataflow style of computation. DIPY on the other hand is a package that provides tools for what happens inside those nodes and therefore does not need a dataflow graph. However, I do know it is trivial to build these pipelines with DIPY and would like to know what exactly you need. Are you using vanilla DIPY to build pipelines?

Not a nipype node graph. I was referring to a module connection graph. I already created them for fmriprep and qsiprep; they simply show the dependencies between the packages etc.