Serge Koudoro
@skoudoro
use: source activate base
and then you can use pip install
AlexBadea
@portokalh
Serge, I am on Python 3.7.3 in the terminal and 3.7.4 in Jupyter, and I am trying to synchronize those. Why so many Pythons?
AlexBadea
@portokalh
Tried to clean things up; now it looks like I may need to reinstall Anaconda: conda install python=3.7 gives //anaconda3/bin/conda: //anaconda3/bin/python: bad interpreter: No such file or directory
AlexBadea
@portokalh
Thanks @romainviard and @skoudoro, all is well after reinstalling Anaconda and DIPY and reconciling the versions between Jupyter and the terminal! Very grateful!
QuantumBrain®
@Quantum_Neuron_twitter
Is there a simple way to filter/select a subset of b-values within my dataset and gradient table? E.g., keep everything up to b <= 2000.
Ariel Rokem
@arokem
@Quantum_Neuron_twitter : yes, something like idx = gtab.bvals <= 2000; subset = data[..., idx] should give you a 4D array with data only from the bvals <= 2000
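(For reference, a minimal sketch of that, assuming data is the 4D array and gtab an existing GradientTable; rebuilding a matching subset table with gradient_table is an extra step not mentioned above:)

import numpy as np
from dipy.core.gradients import gradient_table

idx = gtab.bvals <= 2000          # boolean mask over the volumes
subset = data[..., idx]           # 4D array keeping only the b <= 2000 volumes
sub_gtab = gradient_table(gtab.bvals[idx], gtab.bvecs[idx])  # matching table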
QuantumBrain®
@Quantum_Neuron_twitter
ah great, thx Ariel
Eleftherios Garyfallidis
@Garyfallidis
Hey!! Registration for the DIPY Workshop 2020 is now open. Visit https://workshop.dipy.org for information about dates, the program, and accommodation.
Hope to see you all in Bloomington this March for the workshop. Last year the workshop was a blast and people evaluated it 5 out of 5 stars overall.
Shreyas Fadnavis
@ShreyasFadnavis
:rocket: :star2: :heart_eyes:
Eleftherios Garyfallidis
@Garyfallidis
:rocket: :heart: :rocket:
Bramsh Q Chandio
@BramshQamar
:rocket: :fire:
Kerolss
@Kerolss
Hello,
Can anyone help me?
I got the following issue while running the command below:
curl https://bootstrap.pypa.io/get-pip.py -o get-pip.py
Issue: curl: (35) schannel: next InitializeSecurityContext failed: Unknown error (0x80092012) - The revocation function was unable to check revocation for the certificate.
N.B.: The installation did not complete because the PC shut down, and when I ran the command again I got the error above.
kimberlylray
@kimberlylray
@Garyfallidis I'm interested in the 2020 DIPY course; could you explain the difference between the 3- and 5-day course registration options? Is it as simple as the 3-day includes days 1-3 and the 5-day is days 1-5?
Eleftherios Garyfallidis
@Garyfallidis
Yes, that is correct @kimberlylray
Rajikha Raja
@rajikha
Hi DIPY team, I am interested in trying out the various reconstruction models for estimating diffusion using DIPY. May I know which models can be used for reconstruction in the case of single-shell DWI, and are there any constraints requiring higher b-values, such as b=2000, for any specific model? Thank you
Ariel Rokem
@arokem
Hey @rajikha : that is the topic of this long-languishing PR: nipy/dipy#1919
So there's some information there
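(As one concrete illustration, not from the PR itself: the diffusion tensor model is a common single-shell choice, and a minimal fit might look like this, with data, bvals, and bvecs as placeholders for your own acquisition:)

from dipy.core.gradients import gradient_table
from dipy.reconst.dti import TensorModel

gtab = gradient_table(bvals, bvecs)   # single-shell acquisition
tenfit = TensorModel(gtab).fit(data)  # fit the tensor model voxel-wise
fa = tenfit.fa                        # fractional anisotropy map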
Rajikha Raja
@rajikha
@arokem Thank you
kimberlylray
@kimberlylray
@Garyfallidis great, thank you!
Thomas
@ThomasTT_gitlab
Hi :) I want to use another metric, MAM. Can anyone help me with how to substitute this measure into QuickBundles in Python?
Serge Koudoro
@skoudoro
Hi @ThomasTT_gitlab, you can look at this blog post https://medium.com/isiway-tech/gps-trajectories-clustering-in-python-2f5874204a53 or this tutorial: https://dipy.org/documentation/1.0.0./examples_built/segment_extending_clustering_framework/#example-segment-extending-clustering-framework. Do not hesitate to send us a link to your code; then we can see your mistake and it will be easier to help you.
Thomas
@ThomasTT_gitlab
@skoudoro THX a lot :)
Luke Bloy
@bloyl
Hi all, quick question.
I have a registration deformation field (from ITK or something similar) that I would like to apply to an image. It seems like a diffeomorphic map, like in this test (https://github.com/nipy/dipy/blob/05c2a75581525ccf2f32d418ebf57fb0c70d9f86/dipy/align/tests/test_imwarp.py#L569-L575), would do it, but I have two questions.
1) I only have a vector field mapping the fixed space to the moving. Should I attach it as mapping.forward or mapping.backward?
2) Do I need to attach something to the other one?
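(For reference, one plausible wiring, as a rough, untested sketch only: the constructor arguments and the choice of forward below are assumptions worth verifying against the linked test, and disp, fixed_affine, fixed_shape, moving_shape, moving_affine, moving_data are placeholders:)

import numpy as np
from dipy.align.imwarp import DiffeomorphicMap

# `disp`: (X, Y, Z, 3) displacement field defined on the fixed grid,
# pointing into the moving space; affines/shapes come from the image headers
mapping = DiffeomorphicMap(3, disp.shape[:3],
                           disp_grid2world=fixed_affine,
                           domain_shape=fixed_shape,
                           domain_grid2world=fixed_affine,
                           codomain_shape=moving_shape,
                           codomain_grid2world=moving_affine)
mapping.forward = disp.astype(np.float32)  # assumption: the fixed->moving field goes in `forward`
warped = mapping.transform(moving_data)    # moving image resampled onto the fixed grid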
nis02002
@nis02002
Does anyone know where the MRI databases are? Both MRI images and b-vectors?
Thomas
@ThomasTT_gitlab
Hi :) As I wrote, I want to use another metric, MAM, in QuickBundles. I already wrote the distance function but something is not working :( What is wrong? I paste the code below:

import math
import numpy as np
from dipy.segment.clustering import QuickBundles
from dipy.segment.metric import Metric, VectorOfEndpointsFeature

class mam(Metric):
    """ Computes the MAM distance between two streamlines. """

    def __init__(self):
        # For simplicity, features will be the vector between endpoints of a streamline.
        super(mam, self).__init__(feature=VectorOfEndpointsFeature())

    def are_compatible(self, shape1, shape2):
        """ Checks if two features are vectors of same dimension.

        Basically this method exists so we don't have to do this check
        inside the `dist` method (speedup).
        """
        return shape1 == shape2 and shape1[0] == 1

    def dist(self, v1, v2):
        track1 = np.ascontiguousarray(v1, dtype=np.float32)
        t1_len = track1.shape[0]
        track2 = np.ascontiguousarray(v2, dtype=np.float32)
        t2_len = track2.shape[0]

        # preallocate buffer arrays for the per-point minimum distances
        min_t2t1 = np.zeros((t2_len,), dtype=np.float32)
        min_t1t2 = np.zeros((t1_len,), dtype=np.float32)

        for t2_pi in range(0, t2_len):
            min_t2t1[t2_pi] = np.inf
        for t1_pi in range(0, t1_len):
            min_t1t2[t1_pi] = np.inf

        t1_pt = track1
        t2_pt = track2

        # calculate min squared distance between each point in the two
        # lines; squared distance to delay doing the sqrt until after this
        # speed-critical loop
        for t1_pi in range(0, t1_len):
            for t2_pi in range(0, t2_len):
                d0 = t1_pt[t1_pi][0] - t2_pt[t2_pi][0]
                d1 = t1_pt[t1_pi][1] - t2_pt[t2_pi][1]
                delta2 = d0 * d0 + d1 * d1  # + d2*d2 (z term left out here)
                if delta2 < min_t1t2[t1_pi]:
                    min_t1t2[t1_pi] = delta2

        for t2_pi in range(0, t2_len):
            for t1_pi in range(0, t1_len):
                d0 = t1_pt[t1_pi][0] - t2_pt[t2_pi][0]
                d1 = t1_pt[t1_pi][1] - t2_pt[t2_pi][1]
                delta2 = d0 * d0 + d1 * d1  # + d2*d2 (z term left out here)
                if delta2 < min_t2t1[t2_pi]:
                    min_t2t1[t2_pi] = delta2

        # sqrt to get Euclidean distance from squared distance
        for t1_pi in range(0, t1_len):
            min_t1t2[t1_pi] = math.sqrt(min_t1t2[t1_pi])
        for t2_pi in range(0, t2_len):
            min_t2t1[t2_pi] = math.sqrt(min_t2t1[t2_pi])

        mean_t2t1 = 0
        mean_t1t2 = 0

        for t1_pi in range(0, t1_len):
            mean_t1t2 += min_t1t2[t1_pi]
        mean_t1t2 = mean_t1t2 / t1_len
        for t2_pi in range(0, t2_len):
            mean_t2t1 += min_t2t1[t2_pi]
        mean_t2t1 = mean_t2t1 / t2_len
        return np.min((mean_t2t1, mean_t1t2))

metric = mam()
qb2 = QuickBundles(threshold=0.15, metric=metric)
clus = qb2.cluster(streamlines)

I will be grateful for help :)
nis02002
@nis02002
How can I check the Stanford database?
nis02002
@nis02002
Where can I find a database with data from before/after an operation?
Eleftherios Garyfallidis
@Garyfallidis
@nis02002 not sure if I understand your question. But in your home folder there should be a folder called .dipy; inside that folder are all the datasets that are fetched with DIPY.
nis02002
@nis02002
@Garyfallidis thanks for the information. I wonder how to check the data information before fetching (I do have a folder for the fetched data, but I don't know the detailed information, for example the subjects' health/age, or whether the patients had a certain operation before). Is there any database where I can browse all the different kinds of datasets?
Or, is there any documentation for all the datasets DIPY can fetch?
Shreyas Fadnavis
@ShreyasFadnavis
@nis02002 you might want to take a look at https://github.com/nipy/dipy/blob/master/dipy/data/fetcher.py for all the relevant information about the data. We also have a command-line workflow, FetchFlow, to fetch all the datasets.
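(For reference, the Python-side fetchers live in dipy.data and download into ~/.dipy; a minimal sketch using the Stanford HARDI dataset as an example:)

from dipy.data import fetch_stanford_hardi, read_stanford_hardi

fetch_stanford_hardi()             # downloads into ~/.dipy/stanford_hardi
img, gtab = read_stanford_hardi()  # NIfTI image and its gradient table
data = img.get_fdata()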
nis02002
@nis02002
@ShreyasFadnavis thank you, I will check the details. Looks like the datasets are from the Stanford lib/HCP and NTU?
nis02002
@nis02002
Does anyone know how to find a dMRI dataset where the subjects have Alzheimer's disease?
Brian
@bhsilverstein
Hey everybody, a question about spatial transformations. I'm trying to migrate my analysis from MRtrix to DIPY and encountering some challenges. Right now, I'm trying to align some streamlines with spherical ROIs. They were definitely aligned in MRtrix (I used the coordinates drawn from the T1 and transformed to b0 space to generate seed/target ROIs), but after loading the streamlines with load_tractogram using the b0 image as a reference and plotting them with spherical ROIs using sphere_actor and stream_actor, the two are in very different spaces. It looks like the origin is shifted and the spatial scaling is different. Is there some transformation happening under the hood in load_tractogram somewhere? If so, I'm guessing applying the same transformation to my coordinates will solve the issue.
nis02002
@nis02002
Hi everyone, does DIPY fetch HCP datasets?
Ariel Rokem
@arokem
@nis02002: DIPY does not fetch HCP datasets, because we don't have a way to verify that you have agreed to the HCP terms and conditions
That said, we wrote a fetcher to get these data from AWS here: https://github.com/yeatmanlab/pyAFQ/blob/master/AFQ/data.py. It uses the DIPY fetcher infrastructure and extends it via boto3 to fetch data from the HCP open access bucket on AWS. It will require that you get your own HCP credentials. See instructions here: https://wiki.humanconnectome.org/display/PublicData/How+To+Connect+to+Connectome+Data+via+AWS
nis02002
@nis02002
@arokem thanks a lot, I will check it out. Will it get the HCP image files and b-files?
Serge Koudoro
@skoudoro
Hi @ThomasTT_gitlab, I would replace the feature: use IdentityFeature instead of VectorOfEndpointsFeature(), and then remove shape1[0] == 1 from your are_compatible function. You do not need an endpoints vector in your case for MAM. Let me know if it works; otherwise create an issue on DIPY, it will be easier to help you.
Serge Koudoro
@skoudoro
If you want to improve performance, you can use the bundles_distances_mam function from dipy.tracking.distances in your dist function. But I recommend that you first check your own version and then compare.
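(Putting those two suggestions together, a rough, untested sketch of the corrected metric might look like this; MamMetric is just an illustrative name, the metric='min' choice mirrors the np.min in the original code, and the threshold is a placeholder in the units of your streamline coordinates:)

import numpy as np
from dipy.segment.clustering import QuickBundles
from dipy.segment.metric import Metric, IdentityFeature
from dipy.tracking.distances import bundles_distances_mam

class MamMetric(Metric):
    """ MAM distance between two streamlines via DIPY's fast implementation. """

    def __init__(self):
        # IdentityFeature passes whole streamlines through to `dist`
        super(MamMetric, self).__init__(feature=IdentityFeature())

    def are_compatible(self, shape1, shape2):
        # streamlines may have different numbers of points; only the
        # point dimensionality (3) has to match
        return shape1[1] == shape2[1]

    def dist(self, v1, v2):
        t1 = np.ascontiguousarray(v1, dtype=np.float32)
        t2 = np.ascontiguousarray(v2, dtype=np.float32)
        # 1x1 distance matrix between two single-track "bundles"
        return bundles_distances_mam([t1], [t2], metric='min')[0, 0]

qb = QuickBundles(threshold=10., metric=MamMetric())
clusters = qb.cluster(streamlines)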
Eleftherios Garyfallidis
@Garyfallidis
Hi @bhsilverstein, you have to transform either the streamlines or the ROIs so that they are in the same space. Are you using the latest release version of DIPY?
If yes, then the output of load_tractogram offers methods to do such a transform.
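(For instance, a minimal sketch of those methods, with placeholder filenames:)

from dipy.io.stateful_tractogram import Space
from dipy.io.streamline import load_tractogram

sft = load_tractogram('tracks.trk', 'b0.nii.gz', to_space=Space.RASMM)
sft.to_vox()                     # world (mm) -> voxel coordinates of the reference
streamlines = sft.streamlines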
Brian
@bhsilverstein
Thanks, @Garyfallidis. I'm using the latest release. I use Space.RASMM to load the tracks, and I've tried using the to_xxxx methods after loading, but without success. They shift the tracks, but not into the space my ROI coordinates are in.
Though I'm seeing that the affines in the streamline and b0 headers are the same. Perhaps something's off about my ROI coordinates...