Serge Koudoro
@skoudoro
yes @ShreyaKapoor18, I recommend you have a look at the tutorials, Tractography / Fiber tracking section: https://dipy.org/tutorials .
Shreya Kapoor
@ShreyaKapoor18
@skoudoro thanks for your reply. I looked at that section, but is whole-brain tractography accomplished using the local tracking method only? I don't really understand.
arnaudbore
@arnaudbore
pip doesn't find dipy==1.1.1 :(
Serge Koudoro
@skoudoro
Strange @arnaudbore; if you look at this link https://pypi.org/project/dipy/#files, DIPY 1.1.1 is present for many OSes.
What is your environment?
Yijun Liu
@snapfinger
Hi there, I have a question about the usage of the word “reconstruction” in DIPY's documentation. E.g., for the tensor model, the tutorial's title is “Reconstruction of the diffusion signal with the Tensor model”. Is it really “reconstruction”, or is it rather “modeling”? In the MRI field, reconstruction usually refers to reconstructing the MR image from k-space data, but here we already have the images, which confuses me.
archerdb
@archerdb
@Garyfallidis @RafaelNH Hello! Our group is very interested in moving our single-shell free-water elimination analysis from MATLAB (Pasternak code) to DIPY. Is there any estimate of when this will be available in DIPY?
Pietro Astolfi
@pietroastolfi
@arokem Hi. I'm trying to apply pyAFQ these days. I don't apply it via the CLI; instead, I directly use some of the defined functions. In my case I have the tractogram coregistered to MNI space, but not the DWI data, and I'm trying to apply only the AFQ segmentation to the tractogram. Because of the missing DWI, I'm calling the function Segmentation.segment by providing reg_template, img_affine, and mapping, where reg_template is MNI152_T1_1mm, img_affine is the affine of this MNI T1, and mapping is the identity mapping (since the tractogram and the template are already in the same space). After this premise, the problem: the segmentation seems to work in the first step, the waypoint-ROI filtering, but it produces a strange behaviour in the second step, the endpoint filtering, where some tracts like the ATR have a huge drop in the number of streamlines. I looked in detail into the reason for this drop by saving the endpoint AAL ROIs corresponding to ATR_L and overlapping them with the segmentation produced by AFQ without endpoint filtering.
[attached screenshot: Schermata 2020-04-24 alle 14.08.07.png]
Here is what I obtain: the endpoint ROI is in the wrong hemisphere.
Ariel Rokem
@arokem
@pietroastolfi : thanks for reporting. Could you please post an issue on the pyAFQ repo: https://github.com/yeatmanlab/pyAFQ/issues ?
That's a great find and might be related to things that others have seen: yeatmanlab/pyAFQ#235
My main conclusion is that this part of the processing is not yet robust enough. It works sometimes, but it's not bulletproof yet.
So, we need to investigate
tecork
@tecork
I've been searching for a few days, but does dipy support 2D rigid registration?
It should work in 2D as well.
If not, in a pinch I might try adding a singleton dimension to your data just to get it in there
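For what it's worth, a rough sketch of the 2D case (hedged: static2d and moving2d stand for your own 2D arrays, np.eye(3) is used in place of real grid-to-world affines, and the parameter values are just common defaults):

# Sketch of 2D rigid registration with DIPY's affine registration framework.
import numpy as np
from dipy.align.imaffine import AffineRegistration, MutualInformationMetric
from dipy.align.transforms import RigidTransform2D

metric = MutualInformationMetric(nbins=32, sampling_proportion=None)
affreg = AffineRegistration(metric=metric,
                            level_iters=[1000, 100, 10],
                            sigmas=[3.0, 1.0, 0.0],
                            factors=[4, 2, 1])
# static2d, moving2d: 2D numpy arrays (placeholders for your images)
rigid_map = affreg.optimize(static2d, moving2d, RigidTransform2D(), None,
                            np.eye(3), np.eye(3))
warped2d = rigid_map.transform(moving2d)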
salomaaa
@salomaaa
Hello, I have a problem with this library:
from dipy.tracking.local import LocalTracking, ThresholdTissueClassifier
The error: No module named 'dipy.tracking.local'
Shreya Kapoor
@ShreyaKapoor18
@salomaaa it's probably that these commands work with a different DIPY version than the one you are currently using. As of version 1.1.0 you can use the command
from dipy.tracking.local_tracking import LocalTracking
According to the API documentation https://dipy.org/documentation/1.1.0./api_changes/ , ThresholdTissueClassifier -> ThresholdStoppingCriterion
Shreya Kapoor
@ShreyaKapoor18
@salomaaa so you can do this with from dipy.tracking.stopping_criterion import ThresholdStoppingCriterion
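For reference, the two updated imports together (simply the DIPY >= 1.0 spelling of what the old dipy.tracking.local module provided, per the API changes page above):

# Updated import paths for DIPY >= 1.0: dipy.tracking.local was split, and the
# tissue classifiers were renamed to stopping criteria.
from dipy.tracking.local_tracking import LocalTracking
from dipy.tracking.stopping_criterion import ThresholdStoppingCriterion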
Ariel Rokem
@arokem
@salomaaa : how did you install DIPY?
araikes
@araikes
Posted a question on GitHub (dipy/dipy#2164) but I'll throw it here for visibility too. I ran a DKI reconstruction and got some weird outputs. The DTI scalars look OK, but the DKI scalars don't resemble white matter paths. Any help would be great.
Eleftherios Garyfallidis
@Garyfallidis
@RafaelNH do get back to @araikes
araikes
@araikes
Thanks @Garyfallidis
Reshu Singh
@sreshu
Hi everyone! I got to know about DIPY via GSoD '20 :) Looking forward to working alongside the developers to document cool stuff!
Serge Koudoro
@skoudoro
Welcome @sreshu! Looking forward to your contributions!
John Samuelsson
@johnsam7
Hi everyone, I am moving over from ANTsPy to DIPY and was wondering how I can apply a diffeomorphic transform to points after the registration is done. That is, I am looking for a function that takes a set of points and a registration transform as input and outputs the transformed points, moved along the velocity field of the transform, like ants.apply_transforms_to_points in the ANTsPy package. Many thanks for any help!
Alessandro Daducci
@ADaducci_twitter
Hi @Garyfallidis, how are you doing? A question for the guru: once I perform clustering with QuickBundles, how do I get access to the streamline that is closest to the centroid of its cluster? In my application, I cannot simply take the resulting centroid, as it has a lower number of points. Any help would be appreciated! ;-)
Eleftherios Garyfallidis
@Garyfallidis
Hi @ADaducci_twitter do you mean you want to get the medoid?
Alessandro Daducci
@ADaducci_twitter
In general, once I obtain a cluster, I'd like to extract the streamline (from the input set, i.e., with exactly the same geometry, number of points, etc.) that best represents the cluster. We implemented a way by recomputing the distance of each streamline from the centroid, but I was wondering whether there is a more direct way from the output of QuickBundles, for instance without the need to recompute the distances.
Eleftherios Garyfallidis
@Garyfallidis
That is the correct way to do this, Alessandro. You can use an existing function called bundles_distances_mdf or bundles_distances_mam to calculate all the distances from the centroid and find the closest (most similar) streamline.
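For reference, a rough sketch of that recipe (hedged: streamlines is your original input set; MDF needs both sets to have the same number of points, so only temporary resampled copies are compared and the returned streamline keeps its original geometry):

# Find, for one QuickBundles cluster, the original streamline closest to its
# centroid using the MDF distance.
import numpy as np
from dipy.segment.clustering import QuickBundles
from dipy.tracking.streamline import set_number_of_points
from dipy.tracking.distances import bundles_distances_mdf

qb = QuickBundles(threshold=10.)
clusters = qb.cluster(streamlines)
cluster = clusters[0]
centroid = cluster.centroid  # resampled representation (12 points by default)
resampled = set_number_of_points([streamlines[i] for i in cluster.indices],
                                 nb_points=len(centroid))
d = bundles_distances_mdf([centroid], resampled)[0]
best = streamlines[cluster.indices[np.argmin(d)]]  # original geometry kept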
Ghost
@ghost~5e5ac431d73408ce4fdb29e3
Good morning, hope everyone is fine. I have a small problem: I would like to minimize test_function. Basically, test_function() takes 3 variables, m, n, and p. The goal is to find the values of these 3 variables for which the function returns the minimal possible value. I am using Nelder-Mead minimization. While running my script for x0 = [25.0, 45.0, 10.0] I am getting an error like "Maximum number of function evaluations has been exceeded." I followed this Stack Overflow link to write my code: https://stackoverflow.com/questions/55751317/minimize-multivariable-function (the Stack Overflow site is blocked in my area). Please help me with this, thanks. My data and script are attached here: https://i.fluffy.cc/HLR1jCJLLV8lX4NfjSqbjRKG6DsB4bwS.html
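A hedged sketch of one way around that error with SciPy (test_function below is only a placeholder objective, and the option values are illustrative; the usual fix is to raise maxfev/maxiter or loosen the tolerances):

# Nelder-Mead with an explicit evaluation budget and tolerances.
import numpy as np
from scipy.optimize import minimize

def test_function(x):
    m, n, p = x
    # placeholder objective; replace with the real computation
    return (m - 1.0) ** 2 + (n - 2.0) ** 2 + (p - 3.0) ** 2

x0 = np.array([25.0, 45.0, 10.0])
res = minimize(test_function, x0, method='Nelder-Mead',
               options={'maxfev': 10000, 'maxiter': 10000,
                        'xatol': 1e-6, 'fatol': 1e-6})
print(res.x, res.fun, res.message)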
CherylMcC
@CherylMcC
Hi. I'm new to DIPY and would like to set up an analysis pipeline using the free-water elimination model. As I'm working through the preprocessing examples, I did not find information on motion and geometric correction. This will be an important step for me. Can you point me in the right direction? Should I use another tool to generate motion/eddy-current-corrected data first? Thanks.
Matt Cieslak
@mattcieslak
Hi @CherylMcC I'd suggest trying QSIPrep: https://qsiprep.readthedocs.io/en/latest/. It will do both of these corrections with a single interpolation
CherylMcC
@CherylMcC
Thanks @mattcieslak
Vinit K Srivastava
@vinkirk

Hi all, I'm trying to save generated streamlines using StatefulTractogram but get the error message below. I am able to save tractograms from the raw and eddy-corrected images; however, I get this problem after distortion correction using ANTs. The streamlines themselves are generated and can be visualized with the FURY package for the raw, eddy, and distortion+eddy corrected data. Any help would be greatly appreciated. Thanks.


ValueError Traceback (most recent call last)

<ipython-input-33-25d7d840788d> in <module>()
36 from dipy.io.streamline import save_trk
37
---> 38 sft = StatefulTractogram(streamlines, img, Space.RASMM)
39 #save_trk(sft, "DTItractogram_deterministic_ctxwm-lh-postcentral_T1space-disteddy.trk")
40

~/miniconda3/envs/ants/lib/python3.6/site-packages/dipy/io/stateful_tractogram.py in __init__(self, streamlines, reference, space, origin, data_per_point, data_per_streamline)
119 'using them with StatefulTractogram.')
120 else:
--> 121 space_attributes = get_reference_info(reference)
122 if space_attributes is None:
123 raise TypeError('Reference MUST be one of the following:\n'

~/miniconda3/envs/ants/lib/python3.6/site-packages/dipy/io/utils.py in get_reference_info(reference)
276
277 if not affine[0:3, 0:3].any():
--> 278 raise ValueError('Invalid affine, contains only zeros.'
279 'Cannot determine voxel order from transformation')
280 voxel_order = ''.join(nib.aff2axcodes(affine))

ValueError: Invalid affine, contains only zeros.Cannot determine voxel order from transformation

Vinit K Srivastava
@vinkirk
Figured out the problem: the affine from the distortion correction had its negative signs removed. Once I corrected for this, I was able to save the streamlines.
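For anyone hitting the same thing, a small sanity check of the reference image before building the StatefulTractogram (the file name below is a placeholder):

# The rotation part of the affine should not be all zeros, and the axis codes
# should match what you expect, e.g. ('R', 'A', 'S').
import nibabel as nib

img = nib.load('dwi_after_distortion_correction.nii.gz')  # placeholder path
print(img.affine)
print(nib.aff2axcodes(img.affine))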
salomaaa
@salomaaa
Hello, how can I do tracking only for an ROI, not the whole brain?
Gabriele
@gamorosino
Is there a DIPY function to perform EPI distortion correction based on non-linear (i.e., diffeomorphic) registration?
Pietro Astolfi
@pietroastolfi
Hi all, I need to run SIFT2 (https://mrtrix.readthedocs.io/en/latest/reference/commands/tcksift2.html) using the FODs computed with DIPY, but I have doubts about the compatibility of the two, e.g. https://community.mrtrix.org/t/first-spherical-harmonic-coefficient-y-0-0-meaning/507 . Has anyone here already dealt with such a situation?
Serge Koudoro
@skoudoro
Hi @salomaaa, yes, you can do tracking only for an ROI; you just need to define a mask and seed only in this mask. I think we have a tutorial for that, let me check.
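In the meantime, a rough sketch of the idea (hedged: roi_mask, affine, direction_getter and stopping_criterion are placeholders for objects you already have from the usual tracking workflow):

# Seed only inside a binary ROI mask instead of the whole brain.
from dipy.tracking.utils import seeds_from_mask
from dipy.tracking.local_tracking import LocalTracking
from dipy.tracking.streamline import Streamlines

seeds = seeds_from_mask(roi_mask, affine, density=2)
streamline_generator = LocalTracking(direction_getter, stopping_criterion,
                                     seeds, affine, step_size=0.5)
roi_streamlines = Streamlines(streamline_generator)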
Hi @gamorosino, you can do eddy current correction by affinely registering all non-b0 volumes to the b0. We are planning to have a tutorial and interface about that. For the moment, you can look at the example from @Garyfallidis and add an extra step with our diffeomorphic registration: https://gist.github.com/Garyfallidis/42dd1ab04371272050221275c6ab9bd6
Serge Koudoro
@skoudoro
Hi @pietroastolfi, I suppose some people have already faced this. Personally, I am not familiar with MRtrix, so I would need to look into it. I hope someone else will have an answer for you. Maybe @arokem, @ShreyasFadnavis, or @Garyfallidis?
Eleftherios Garyfallidis
@Garyfallidis
@pietroastolfi have you looked into this PR? dipy/dipy#2191
ejb119
@ejb119
Hello all,
I have a question about the Symmetric Diffeomorphic Registration in 3D tutorial. I would like to view/save the mapping that transforms the moving image into the static image. I have tried using get_map(), but this gives an error: 'DiffeomorphicMap' object has no attribute 'static_to_ref'. Any suggestions on how to obtain the mapping data?
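Not an authoritative answer, but a hedged sketch of one way to get at the mapping data, assuming mapping is the DiffeomorphicMap returned directly by SymmetricDiffeomorphicRegistration.optimize() in that tutorial (verify the method names against your DIPY version; moving and static_affine are placeholders from the tutorial):

# Inspect and save the displacement fields stored in the DiffeomorphicMap.
import nibabel as nib

forward_field = mapping.get_forward_field()   # (X, Y, Z, 3) displacement field
warped_moving = mapping.transform(moving)     # moving image warped to static
nib.save(nib.Nifti1Image(forward_field, static_affine), 'forward_field.nii.gz')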