Eleftherios Garyfallidis
@Garyfallidis
@portokalh what was the input? And how did you run the command?
Eleftherios Garyfallidis
@Garyfallidis
@AtulPhadke can you print data.shape and new_img.shape?
araikes
@araikes
Is there a CLI approach where I could use an existing mask to crop an image? dipy_median_otsu has the --autocrop option, but that computes another mask, potentially unnecessarily.
Mohammad Hadi Aarabi
@mohammadhadiarabi
I have a question: if I have multi-shell data and want to fit it to the tensor model, how can I select only the bval = 0 and 1000 volumes of my data?
Shreyas Fadnavis
@ShreyasFadnavis
Hi @mohammadhadiarabi -- Great question! This works by changing the bvals and bvecs you pass to the gtab object when you fit the data with TensorModel, as follows:
Shreyas Fadnavis
@ShreyasFadnavis
# imports (assuming the usual DIPY setup)
import numpy as np
import dipy.reconst.dti as dti
from dipy.core.gradients import gradient_table

# these are your original bvals and bvecs
bvals = gtab.bvals
bvecs = gtab.bvecs

# here we select only the 0 and 1000 bvals
selected_bvals = np.logical_or(bvals == 0, bvals == 1000)

# here we pick only the 3D volumes from the 4D dMRI data
# pertaining to the 0 and 1000 shells selected above
data_selected = data[:, :, :, selected_bvals]

# now we also take only the bvals and bvecs of the same 0 and 1000 shells
gtab_selected = gradient_table(bvals[selected_bvals], bvecs[selected_bvals])

# now you can use these to fit the DTI model
tenmodel = dti.TensorModel(gtab_selected)
tenfit = tenmodel.fit(data_selected)
Does this answer your question?
Mohammad Hadi Aarabi
@mohammadhadiarabi
@ShreyasFadnavis Thank you so much for your prompt response!
Eleftherios Garyfallidis
@Garyfallidis
@araikes I do not think we have such a CLI. Can you write an issue? Be happy to start a PR too (if you can).
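Until such a CLI exists, the crop itself can be sketched in a few lines of NumPy, assuming `data` and `mask` are arrays you have already loaded (e.g. with nibabel); this is a plain-Python sketch, not a DIPY command:

```python
import numpy as np

def crop_to_mask(data, mask):
    """Crop an image to the bounding box of a binary mask.

    data : ndarray whose first 3 dims match the mask
    mask : 3D boolean/integer ndarray
    """
    coords = np.argwhere(mask)        # voxel indices inside the mask
    mins = coords.min(axis=0)         # lower corner of the bounding box
    maxs = coords.max(axis=0) + 1     # upper corner (exclusive)
    return data[mins[0]:maxs[0], mins[1]:maxs[1], mins[2]:maxs[2]]

# toy example: a 10x10x10 volume with a small mask inside it
data = np.arange(1000).reshape(10, 10, 10)
mask = np.zeros((10, 10, 10), dtype=bool)
mask[2:5, 3:7, 4:6] = True
cropped = crop_to_mask(data, mask)
print(cropped.shape)  # (3, 4, 2)
```

For 4D dMRI data the same slicing applies, with the diffusion axis left untouched.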
Mohammad Hadi Aarabi
@mohammadhadiarabi
@ShreyasFadnavis I have another question. How can I save the selected new bvals and bvecs as .bval and .bvec files?
Behzad Golshaei
@bgolshaei
Hi, I would like to have a strain field instead of a lattice from image registration. Do you have any example of this?
Shreyas Fadnavis
@ShreyasFadnavis
@mohammadhadiarabi : the simplest would be to use numpy --
import numpy as np
np.savetxt('bval_filename.bval', bvals)
np.savetxt('bvec_filename.bvec', bvecs)
Mohammad Hadi Aarabi
@mohammadhadiarabi
@ShreyasFadnavis Thank you
jarcosh
@jarcosh
hi, I was trying to get the residuals of my data fit from dipy.reconst.dti, to derive a per-voxel R^2 of the fit with a separate homemade function. I'm using fit_method = 'NLLS', so I checked _nlls_err_func. I keep getting ValueError: shapes not aligned, so I know I must be doing something wrong. I have two questions: a) where should I best get the 3x3 tensor from? The tenfit object has several variables with dimensions (inputXaxis, inputYaxis, inputSlices, 3, 3), such as tenfit.evecs.T, or (3, 3, inputXaxis, inputYaxis, inputSlices). b) How do I best use it if I simply want an array of the residuals for each voxel? I assume what I want is residuals = _nlls_err_func(tensor, tenmodel.design_matrix, data), but I must be fetching something wrong.
As an example of what I mean, here's a test case I tried with a specific voxel in the first slice, to see if I could fill the array iteratively later. As far as I can tell I'm following the instructions (3x3 matrix, design matrix, array of the signal over all acquisition directions and b values), but I get an error.
Thanks in advance
Ariel Rokem
@arokem
@jarcosh: I think that you need to reshape the tensor to a vector. You might want to take a look at how that's handled here: https://github.com/dipy/dipy/blob/master/dipy/reconst/dti.py#L1665-L1705
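To make the shapes concrete, here is a NumPy-only sketch of a per-voxel residual computation. The parameter ordering assumed here, [Dxx, Dxy, Dyy, Dxz, Dyz, Dzz, log(S0)], follows dipy.reconst.dti.lower_triangular (check it against your DIPY version); the toy design matrix is illustrative, with tenmodel.design_matrix (shape (16, 7) in the question) playing that role in practice:

```python
import numpy as np

def tensor_to_vec(D, s0):
    """Flatten a symmetric 3x3 tensor into the assumed lower-triangular
    order [Dxx, Dxy, Dyy, Dxz, Dyz, Dzz, log(S0)]."""
    return np.array([D[0, 0], D[1, 0], D[1, 1],
                     D[2, 0], D[2, 1], D[2, 2], np.log(s0)])

def voxel_residuals(D, s0, design_matrix, signal):
    """Predicted signal is exp(B @ t); residuals are data - prediction."""
    t = tensor_to_vec(D, s0)
    pred = np.exp(design_matrix @ t)
    return signal - pred

# toy data: isotropic tensor, 4 measurements at b = 0 and b = 1000
D = np.eye(3) * 1e-3
b = 1000.0
# rows: [-b*gx^2, -2b*gx*gy, -b*gy^2, -2b*gx*gz, -2b*gy*gz, -b*gz^2, 1]
B = np.array([[0.0, 0, 0, 0, 0, 0, 1],    # b=0 measurement
              [-b, 0, 0, 0, 0, 0, 1],     # gradient along x
              [0, 0, -b, 0, 0, 0, 1],     # gradient along y
              [0, 0, 0, 0, 0, -b, 1]])    # gradient along z
signal = np.exp(B @ tensor_to_vec(D, 100.0))  # noise-free synthetic signal
res = voxel_residuals(D, 100.0, B, signal)
print(np.allclose(res, 0))  # True: noise-free data gives zero residuals
```

With real data, looping this over brain voxels gives the residual array the question asks for, from which R^2 follows in the usual way.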
Chandana Kodiweera
@kodiweera
Is there a project outline for dipy such as a module connections graph?
Chandana Kodiweera
@kodiweera
Such as these I created for fmriprep and qsiprep: https://github.com/nipreps/fmriprep/discussions/2517 and https://github.com/PennLINC/qsiprep/discussions/289
jarcosh
@jarcosh
@arokem thanks for your quick response. Reading the docstring for that function, it did not occur to me that the tensor matrix ought to be flattened. However, whether I use np.matrix.flatten to turn the 3x3 matrix into a (9,) vector or tenfit.quadratic_form.flat, I still get value errors at the y = np.exp(np.dot(design_matrix, tensor)) step. My design_matrix is (16, 7) shaped, taking b values and directions from the acquisition instead of one of DIPY's default spheres; might this be causing the unexpected behavior? Should I transform it in some way, like how apparent_diffusion_coef uses the transposed version? Apologies if the issue is my lack of mathematical knowledge, but I'm at a loss here.
Yashvardhan Jain
@J-Yash
Hi, I'm working on symmetric diffeomorphic 3D image registration. I was wondering if there is an easy way to calculate the jacobian determinant image of the resulting registered image in DIPY?
Shreyas Fadnavis
@ShreyasFadnavis
@kodiweera ! That is a great question. The graph you are mentioning typically comes from NiPype, which is how nodes are connected for a dataflow style of computation. DIPY, on the other hand, is a package that provides tools for what happens inside those nodes and therefore does not need a dataflow graph. However, it is straightforward to build such pipelines with DIPY, and I would like to know what exactly you need. Are you using vanilla DIPY to build pipelines?
Shreyas Fadnavis
@ShreyasFadnavis
@J-Yash : Looking at the codebase, I believe this is possible -- https://github.com/dipy/dipy/blob/c846bf5a23b7e95343a9cf231df2653473602456/dipy/align/transforms.pyx#L52 is what you need, I assume. I am not very well-versed with this module, so I am tagging @Garyfallidis, @skoudoro and @arokem to correct me if I am wrong. They may also be able to provide more guidance on what you need!
Chandana Kodiweera
@kodiweera
Not a nipype node graph. I was mentioning a modules connection graph; I already created them for fmriprep and qsiprep. They simply show the dependencies between the packages etc.
Mohammad Hadi Aarabi
@mohammadhadiarabi
I have a question. If I have trk files of some fibers in native space, how can I extract the mean of diffusion measures (FA/MD/...) along the fiber paths using dipy?
Yashvardhan Jain
@J-Yash
@ShreyasFadnavis Thank you for the response! This seems close to what I need, but looking at the code it seems to take in a 1-D array. Ideally, I need something like this method in ANTs: https://antspy.readthedocs.io/en/latest/_modules/ants/registration/create_jacobian_determinant_image.html#create_jacobian_determinant_image
The ANTs method takes in a static image (3D in my case) and a transform file (created after SyN), and calculates/returns the jacobian determinant image (in my case the log jacobian). I wonder if functionality like this exists in DIPY? If not, it would be great if someone could help me with how I should go about implementing this in DIPY :)
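If the displacement field can be obtained as an (X, Y, Z, 3) array (the SyN mapping objects expose forward and backward fields), the determinant image itself can be sketched with plain NumPy. Everything below is an illustrative assumption about the field layout, not DIPY API:

```python
import numpy as np

def jacobian_determinant(disp):
    """Jacobian determinant of the transform x -> x + disp(x).

    disp : (X, Y, Z, 3) displacement field in voxel units (assumed layout).
    Returns an (X, Y, Z) array; values < 1 mean local contraction,
    values > 1 mean local expansion.
    """
    X, Y, Z, _ = disp.shape
    J = np.zeros((X, Y, Z, 3, 3))
    for comp in range(3):               # derivatives of each displacement component
        gx, gy, gz = np.gradient(disp[..., comp])
        J[..., comp, 0] = gx
        J[..., comp, 1] = gy
        J[..., comp, 2] = gz
    J += np.eye(3)                      # the transform is identity + displacement
    return np.linalg.det(J)

# sanity check: a zero displacement field gives determinant 1 everywhere
det = jacobian_determinant(np.zeros((5, 5, 5, 3)))
print(np.allclose(det, 1.0))  # True
```

Taking np.log of the result would give the log-Jacobian map that ANTs returns; for anisotropic voxels the gradient spacing would need to be passed to np.gradient.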
Paween Wongkornchaovalit
@paween:matrix.org
[m]
Hi, I am not sure if anyone has asked about deep learning before. As far as I know, DWIs contain four dimensions: three for i, j, k, and another for directions, so in total there are as many 3D images as there are directions. The question is how we can use this data in a deep learning input layer. PS: I'm working with tensorflow. Thanks a lot for your help
Eleftherios Garyfallidis
@Garyfallidis
@paween:matrix.org DL cannot easily handle 4D data. You may want to read this paper: https://www.sciencedirect.com/science/article/pii/S1361841520300943 Also, we will need more information about the actual application; otherwise it is very hard to provide advice.
小秦
@xiaoqinkaihua_twitter
Hi @Garyfallidis, I'm Paween. Thank you for your reply. The application I am planning to work on is using DL to estimate DTI parameters, like FA and MD. From my understanding, we need to generate these DTI parameters from model fitting, which provides maps of the DTI parameters; then we can use these maps as training labels. However, since DWIs always come as 4D data, I have no idea how to put them into the DL process.
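One common workaround (an assumption about the setup, not a DIPY or TensorFlow recipe) is to treat each voxel as one training sample: the diffusion dimension becomes the feature vector and the fitted FA/MD map supplies the label, turning the 4D problem into a plain tabular one. A NumPy sketch of the reshaping, with made-up shapes and random stand-in data:

```python
import numpy as np

# assumed shapes: dwi is (X, Y, Z, n_dirs); fa is the fitted (X, Y, Z) label map
X, Y, Z, n_dirs = 8, 8, 4, 30
dwi = np.random.rand(X, Y, Z, n_dirs)
fa = np.random.rand(X, Y, Z)
brain_mask = np.ones((X, Y, Z), dtype=bool)  # in practice, restrict to brain voxels

# each masked voxel becomes one sample: features = signal across directions
features = dwi[brain_mask]                   # (n_voxels, n_dirs)
labels = fa[brain_mask]                      # (n_voxels,)
print(features.shape, labels.shape)          # (256, 30) (256,)

# these 2D arrays feed directly into e.g. a dense TensorFlow/Keras network:
# model.fit(features, labels, ...)
```

This voxel-wise framing sidesteps 4D convolutions entirely; spatial context, if needed, can be added later with patch-based inputs.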
mert
@mertyergin
Hi. I am working on prostate MRI. I have diffusion-weighted images with different b values [0, 50, 400, 800]. I need to generate DWI images at b value 1400. I examined the dipy documentation but couldn't find anything about it. Do you know any python library to generate a desired b-value diffusion-weighted image from existing DWI with different b values?
Eleftherios Garyfallidis
@Garyfallidis
So @mertyergin , you want to predict the image at b 1400?
Do you have a specific paper in mind @mertyergin ?
AlexBadea
@portokalh
being spoiled with too many options, can somebody please recommend their favorite efficient whole-brain tractography method/function, hopefully using multiple CPUs/multithreading? thank you!
nSpotorno
@nSpotorno
Hi all,
I am trying to fit the MAP-MRI model and it looks (at least to me) like the processing takes too long.
I am running on a modern workstation with 256 GB of memory and 128 CPUs (running Ubuntu 20.04).
I am analyzing a single DWI series of 104 volumes (b-values: 0, 100, 1000, 2500) with a resolution of 2x2x2 mm3. I am following the very nice DIPY tutorial and I can successfully fit the model on a single coronal slice in about 30 mins. I am using these settings: radial_order=6, laplacian_regularization=True, laplacian_weighting=.05, positivity_constraint=True. Now I am trying to run the entire series, and after 15 hours of computation it is still at 60%. Is that plausible, or is there a problem/mistake on my side somewhere? Many thanks in advance.
Eleftherios Garyfallidis
@Garyfallidis
Hi @nSpotorno, are you using peaks_from_model to call the mapmri model? If yes, please specify the number of processes. Start with a small number, e.g. 4, and then increase.
nSpotorno
@nSpotorno
Hi @Garyfallidis, thanks for answering. No, I am not using peaks_from_model. I am following quite literally the tutorial at https://dipy.org/documentation/1.4.1./examples_built/reconst_mapmri/#example-reconst-mapmri apart from changing the dataset. In the meantime the process has finished, producing a reasonable-looking rtop map, for example, but it took ~25 hours. I am quite new to both DIPY and MAP-MRI, so I am assuming I am missing something.
Eleftherios Garyfallidis
@Garyfallidis
And you do have cvxpy installed. Correct @nSpotorno ?
AFAIK if you run it like in the tutorial it uses only one core (CPU).
You can check that by looking at your CPU monitor.
nSpotorno
@nSpotorno
Yes, I have cvxpy installed. Interesting; when I was checking the CPU usage, it looked to me like it was definitely firing more than one. I have also tried using the command line wrapper dipy_fit_mapmri, with similar performance.
Troudi-Abir
@Troudi-Abir
hello everyone, please could you help me compute streamline counts with DIPY?
Eleftherios Garyfallidis
@Garyfallidis
Dear @Troudi-Abir, DIPY provides multiple functions to compute counts. Start by reading this tutorial: https://dipy.org/documentation/1.4.1./examples_built/streamline_tools/#example-streamline-tools
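For orientation, counting how many streamlines pass through each voxel can be sketched with plain NumPy. The dipy.tracking.utils functions in that tutorial do this properly, with affine handling; in this simplified sketch the streamlines are assumed to be arrays of points already in voxel coordinates:

```python
import numpy as np

def streamline_counts(streamlines, vol_shape):
    """Count, per voxel, how many streamlines visit it.

    streamlines : iterable of (N_i, 3) point arrays in voxel coordinates
    vol_shape   : shape of the reference volume, e.g. from the FA image
    """
    counts = np.zeros(vol_shape, dtype=np.int32)
    for sl in streamlines:
        vox = np.unique(np.round(sl).astype(int), axis=0)  # voxels visited
        counts[vox[:, 0], vox[:, 1], vox[:, 2]] += 1       # once per streamline
    return counts

# two toy streamlines in a 4x4x4 volume
sl1 = np.array([[0.0, 0.0, 0.0], [1.0, 1.0, 1.0], [2.0, 2.0, 2.0]])
sl2 = np.array([[1.0, 1.0, 1.0], [1.2, 1.1, 0.9]])  # stays in voxel (1, 1, 1)
counts = streamline_counts([sl1, sl2], (4, 4, 4))
print(counts[1, 1, 1])  # 2 -- both streamlines pass through this voxel
```

The total number of streamlines in a bundle is simply len(streamlines).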
Troudi-Abir
@Troudi-Abir
Dear @Garyfallidis thank you very much.
Behzad Golshaei
@bgolshaei
@Garyfallidis @Troudi-Abir I am using DIPY for image registration, but instead of the deformed mesh I would like to have a displacement vector field. How can I find the displacement field after optimizing my image registration?
Serge Koudoro
@skoudoro
Hi @bgolshaei, I would recommend looking at this tutorial concerning the displacement vector field: https://dipy.org/documentation/1.4.1./examples_built/register_binary_fuzzy/#example-register-binary-fuzzy
Behzad Golshaei
@bgolshaei
@skoudoro Hi Serge, thank you so much. Sorry for asking a maybe silly question, but I have read all of this tutorial and even the git files, and I could not extract the displacement from the "mapping" object. I need the data of the object movement so I can use matplotlib's vector field (quiver) to draw the displacement field.
Serge Koudoro
@skoudoro

@bgolshaei, Deforming a grid is a helpful way to visualize a displacement field. We have a function for this:

from dipy.viz import regtools
regtools.plot_2d_diffeomorphic_map(mapping, 10, 'diffeomorphic_map.png')

If you need quiver to draw the displacement field, we do not have it yet, but it would be a nice small project/contribution to do during a DIPY workshop or Brainhack. You can look inside the function above to adapt it.

I hope it helps
Serge Koudoro
@skoudoro
I forgot to say that you also have mapping.get_forward_field() and mapping.get_backward_field(), which return the displacement fields.
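Building on that: assuming the returned field for a 2D registration is an (X, Y, 2) array (an assumption worth checking against your DIPY version), the arrays matplotlib's plt.quiver expects can be prepared like this, with the plotting call left commented out since only the data preparation is sketched:

```python
import numpy as np

def quiver_inputs(disp, step=4):
    """Subsample a 2D displacement field into quiver-ready arrays.

    disp : (X, Y, 2) displacement field, e.g. mapping.get_forward_field()
           for a 2D registration (assumed layout: last axis = (dx, dy)).
    step : keep every step-th grid point so the arrows stay legible.
    """
    X, Y = np.meshgrid(np.arange(disp.shape[0]),
                       np.arange(disp.shape[1]), indexing='ij')
    s = (slice(None, None, step), slice(None, None, step))
    return X[s], Y[s], disp[s][..., 0], disp[s][..., 1]

# toy field: uniform shift of (1, 0.5) voxels on a 16x16 grid
disp = np.zeros((16, 16, 2))
disp[..., 0], disp[..., 1] = 1.0, 0.5
Xs, Ys, U, V = quiver_inputs(disp, step=4)
print(Xs.shape)  # (4, 4) -- one arrow every 4 grid points per axis
# import matplotlib.pyplot as plt
# plt.quiver(Xs, Ys, U, V); plt.show()
```

For a 3D field, the same subsampling per slice feeds a slice-by-slice quiver plot.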