tecork
@tecork

Hello everyone,
I am trying to run the following code:

from dipy.denoise.localpca import localpca
from dipy.denoise.pca_noise_estimate import pca_noise_estimate
from dipy.core.gradients import gradient_table

gtab = gradient_table(bvals, bvecs)
sigma = pca_noise_estimate(data, gtab, correct_bias=True, smooth=3)
denoised_arr = localpca(data, sigma, tau_factor=2.3, patch_radius=2)

using a dataset that is (100, 128, 3, 21) with the following bvals setup:
array([ 0., 1000., 1000., 1000., 1000., 1000., 1000., 0., 1000.,
1000., 1000., 1000., 1000., 1000., 0., 1000., 1000., 1000.,
1000., 1000., 1000.])
I keep receiving this warning when I run it, though:

.../opt/anaconda3/lib/python3.7/site-packages/dipy/denoise/localpca.py:246: RuntimeWarning: invalid value encountered in true_divide
denoised_arr = thetax / theta

The resulting array is all np.nan's.
Any idea on how to go about this?
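One quick diagnostic (a sketch, not the thread's resolution; the sigma array here is made up) is to check the noise estimate for zeros or NaNs before calling localpca, since non-finite or zero values there can propagate NaNs through the final thetax / theta division:

```python
import numpy as np

# Hypothetical noise map standing in for the output of
# pca_noise_estimate; the real one has the spatial shape of the data.
sigma = np.array([[0.5, 0.0], [np.nan, 1.2]])

# Zeros or NaNs in the noise estimate can propagate NaNs through the
# final thetax / theta division, so it is worth flagging them first.
bad = ~np.isfinite(sigma) | (sigma == 0)
print(int(bad.sum()))  # number of suspect entries → 2
```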

tecork
@tecork
Patch2Self and Non-Local Means techniques work on the same data. Local PCA via empirical thresholds and the Marcenko-Pastur PCA algorithm both throw the same error for this dataset.
Shreyas Fadnavis
@ShreyasFadnavis
Hey @tecork ! thanks for reaching out :) I guess I know why this error is coming. I have never encountered it personally. Can you share some data that you are getting this error with?
2 replies
Serge Koudoro
@skoudoro
Also, it would be good to create an issue concerning this warning/error. Thanks @tecork
tecork
@tecork
@ShreyasFadnavis Here is the dataset I was working with. Not a very impressive dataset, but I wanted to work out the denoising tools for a project I'm working on!
@skoudoro I'll create an issue as well. I just wanted to run it through the community first because I was suspecting it could very well be a user error.
araikes
@araikes
@skoudoro @Garyfallidis: Two questions:
  1. Is there a way to use the tensor output from dipy_fit_dti or dipy_fit_dki to create scalar maps in a subsequent step after reorienting to a T1w image?
  2. Can I fit the RESTORE model using the CLI?
Serge Koudoro
@skoudoro
Hi @araikes, concerning your question number 2: you cannot use the RESTORE model through the CLI. However, this is something easy to add. Can you create an issue? We should be able to add it before the release on November 7-8.
Concerning your question 1, I will let @Garyfallidis answer
araikes
@araikes
@skoudoro Issue added.
@Garyfallidis: I'll clarify my question since it isn't as obvious what I'm thinking now that I'm reading it. If I use ANTs to register the b0 to a T1w or T2w image (imaging a lot of rodents...) and then reorient the tensors (https://github.com/ANTsX/ANTs/wiki/Warp-and-reorient-a-diffusion-tensor-image), is there a DIPY way to then get the scalar maps from those reoriented tensors?
Eleftherios Garyfallidis
@Garyfallidis
@araikes in what form are the reoriented tensors saved?
araikes
@araikes
@Garyfallidis They're a NIFTI. DIPY's tensor is NIFTI-1, so it feeds directly into ANTs ReorientTensor without any manipulation.
Eleftherios Garyfallidis
@Garyfallidis
Okay, can you load them back into DIPY? If yes, then you can use our dipy.reconst.dti module in a predictive way.
The question is what you want to do next.
Do you want to use these tensors to generate metrics such as FA etc?
araikes
@araikes
That's the plan (at the moment)
Eleftherios Garyfallidis
@Garyfallidis
Here is how to decompose the tensor https://github.com/dipy/dipy/blob/master/dipy/reconst/dti.py#L1960 into eigen values and eigen vectors.
Then you can use those to create FA etc. See function to use here https://github.com/dipy/dipy/blob/master/dipy/reconst/dti.py#L54
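To make that last step concrete, here is the fractional anisotropy formula applied to a single set of made-up eigenvalues in plain numpy; dipy's fractional_anisotropy computes the same quantity over a whole eigenvalue volume:

```python
import numpy as np

# Eigenvalues of one illustrative tensor (made-up values), as returned
# per voxel by dipy.reconst.dti.decompose_tensor.
ev = np.array([1.5e-3, 0.4e-3, 0.3e-3])

# Standard FA formula: sqrt(1/2) times the ratio of eigenvalue
# differences to the overall eigenvalue magnitude.
num = (ev[0] - ev[1]) ** 2 + (ev[1] - ev[2]) ** 2 + (ev[2] - ev[0]) ** 2
fa = np.sqrt(0.5 * num / np.sum(ev ** 2))
print(round(float(fa), 3))  # → 0.729
```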
araikes
@araikes
Makes sense. I'll see if I can get something up and working. Thanks
Eleftherios Garyfallidis
@Garyfallidis
You are welcome.
araikes
@araikes
@Garyfallidis It looks like I can't load them back in DIPY. If I use load_nifti (even on the unmodified tensor image produced by dipy_fit_dti) and then dti.decompose_tensor I get: LinAlgError: Last 2 dimensions of the array must be square
araikes
@araikes
data, affine = load_nifti('tensors.nii.gz')
test = dti.from_lower_triangular(data)
evals, evecs = dti.decompose_tensor(test)
fa = dti.fractional_anisotropy(evals)
That produces an FA map of 0s. The tensor data, when read back in, is a 64x128x64x1x6 array of 0s.
Eleftherios Garyfallidis
@Garyfallidis
Not sure if that will work, but can you remove the extra dimension? Use np.squeeze to go to 64x128x64x6?
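A minimal sketch of that suggestion, using a dummy zero array with the shape reported above:

```python
import numpy as np

# Dummy tensor volume with the stray singleton dimension reported above.
data = np.zeros((64, 128, 64, 1, 6))

# Drop the singleton axis so from_lower_triangular sees (..., 6).
data = np.squeeze(data, axis=3)
print(data.shape)  # → (64, 128, 64, 6)
```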
licataae
@licataae

Hi all, I am working with a script that generates peaks using dipy's peaks_from_model() to track white matter pathways. I am attempting to save these peaks to a NIfTI from the .PAM5 file they are saved in using save_peaks(). My PeaksAndMetrics object does not have the affine attribute; however, even if I specify it in save_peaks() as the docs suggest, it still fails to recognize it and gives the error: AttributeError: 'PeaksAndMetrics' object has no attribute 'affine'. I am using dipy 0.15, Python 2.7. Here is my code; any advice is helpful since I am still fairly new to dipy:
`csapeaks = peaks_from_model(model=csa_model,
                            data=maskdata,
                            sphere=sphere,
                            relative_peak_threshold=.25,
                            min_separation_angle=40, mask=mask,
                            return_odf=True, normalize_peaks=True)

print('csa_peaks generated')
pam = save_peaks(os.path.join(Diffusion, 'peaks.pam5'), csapeaks, affine=np.eye(4))

peaks_to_niftis(pam,
                os.path.join(Diffusion, PIDN + 'peaksSH.nii'),
                os.path.join(Diffusion, PIDN + 'peaksdirections.nii'),
                os.path.join(Diffusion, PIDN + 'peaksindices.nii'),
                os.path.join(Diffusion, PIDN + 'peaksvalues.nii'),
                os.path.join(Diffusion, PIDN + 'GFA.nii'), reshape_dirs=False)`

Serge Koudoro
@skoudoro
Hi @licataae, sorry for the late answer. I would recommend switching to Python 3 and a recent version of DIPY: Python 2.7 is deprecated, and this issue was fixed quite a long time ago. However, if you really have no choice, I would recommend looking at the current codebase, where we updated the save_peaks function: https://github.com/dipy/dipy/blob/master/dipy/io/peaks.py. It might help you a lot to rewrite the save function.
licataae
@licataae
Thank you very much! Yes I must update my python/dipy versions... I greatly appreciate your help.
kenebene
@kenebene

Hi, a question for the community: I would need to cluster streamline pairs rather than streamlines whilst still using QuickBundles. By this I mean that I have one streamline pair [A B], where A and B are individual streamlines, and one streamline pair [C D], where C and D are individual streamlines. What I want to do is calculate the distance between A and C, and B and D and cluster based on the total distance between the pairs as: totalDistance = distanceBetween(A, C) +distanceBetween(B, D).

If anyone has any idea of how to achieve this I would be very thankful to hear it!
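One way to start (a sketch assuming paired streamlines have equal point counts; all names here are hypothetical) is to write the pair distance directly in numpy, which could later be wrapped in a custom Metric subclass from dipy.segment.metric so QuickBundles can use it:

```python
import numpy as np

def streamline_dist(s1, s2):
    # Mean pointwise Euclidean distance between two streamlines with
    # the same number of points (a simplified, direct-orientation
    # version of QuickBundles' MDF distance).
    return float(np.mean(np.linalg.norm(s1 - s2, axis=1)))

def pair_dist(pair1, pair2):
    # Distance between streamline pairs [A, B] and [C, D] as described
    # above: d(A, C) + d(B, D).
    (a, b), (c, d) = pair1, pair2
    return streamline_dist(a, c) + streamline_dist(b, d)

# Toy pairs: 10 points each, 3D coordinates.
A = np.zeros((10, 3))
B = np.ones((10, 3))
C = np.full((10, 3), 2.0)
D = np.ones((10, 3))
print(round(pair_dist([A, B], [C, D]), 4))  # → 3.4641
```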

erickirby12
@erickirby12
Hello, I'm still newer to DIPY and coding in general. How would one go about creating an average whole brain tractogram in dipy that is a combination of all subjects' datasets in a group? My current idea is combining all bvec and bval files, transforming all raw dti data to standard space, merge all dti data, then put the resulting file through my dipy pipeline like it was a single subjects data. However, this will create a massive file and I don't think my computer can handle it. Any ideas on a better way to do this?
Eleftherios Garyfallidis
@Garyfallidis
@kenebene Basically, you will need to design your feature and metric distance properly. But I think what you want to do is possible.
kenebene
@kenebene
@Garyfallidis thank you specifically for this input and generally for your great work, much appreciated. I will work with the material you suggested.
Elie Abi Aoun
@Elie-AAA
Hello everyone,
has anyone tried DIPY for fibre tracking in a fibre reinforced composite? If yes, was it successful?
Thanks in advance,
Elie
Surendra Maharjan
@surendra116083
Hello, Thank you very much for designing this awesome package.
irfnt
@irfnt
Do we have FFT in DIPY?
Serge Koudoro
@skoudoro
hi @irfnt, there is no FFT in DIPY; we usually use scipy's or numpy's fft implementations
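For completeness, a tiny round-trip through numpy's n-dimensional FFT on a random 3D volume:

```python
import numpy as np

# DIPY has no FFT of its own; numpy's np.fft (or scipy.fft) is used.
# Forward-transform a random 3D volume and invert it again.
vol = np.random.default_rng(0).standard_normal((8, 8, 8))
freq = np.fft.fftn(vol)
back = np.fft.ifftn(freq).real
print(np.allclose(vol, back))  # → True
```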
Serge Koudoro
@skoudoro

Dear DIPY Community (@/all),

Do you need help adding a new feature to DIPY?

Do you want to share ideas or a moment with DIPY users and developers? Or do you have any questions about your DIPY code?

Feel free to join us on December 16th-17th during our Brainhack event! (https://brainhack.luddy.indiana.edu/)

Whatever your background or level of expertise is, you are encouraged to join us and participate: propose and discuss ideas, showcase demos, or contribute to activities initiated by others.

This free online event will be a good opportunity to connect with each other (registration is mandatory).
Some users have already proposed projects that you might be interested in:

  • Add new features to image registration
  • Sphinx Gallery Integration in DIPY
  • Improve Patch2self
  • Sprint to fix issues on DIPY
  • Implementation of the generalized volumetric atlas-based method for tractogram segmentation
  • Convert python 2 dipy script to python 3
  • Your project!

You will find all the information (schedule, registration, ...) at https://brainhack.luddy.indiana.edu/

Chandana Kodiweera
@kodiweera
Postdoctoral position available to work on A2CPS DWI data: https://twitter.com/fMRIstats/status/1480974031855652864.
Please spread the word. Thank you.