These are chat archives for thunder-project/thunder

9th
Jun 2016
Mark Heppner
@mheppner
Jun 09 2016 13:46
I have an Images object with shape (5, 1600, 2000). How would I get the pixel values from say x,y of 500,500 for each image in the collection?
Davis Bennett
@d-v-b
Jun 09 2016 14:30
@mheppner I'm assuming when you say x,y you mean the last 2 axes? In that case, you could do images_object[:,500,500].toarray()
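(A minimal numpy sketch of what that slice does, assuming the Images object wraps a (5, 1600, 2000) stack and follows numpy-style indexing; `stack` here is a stand-in array, not thunder's internal representation:)

```python
import numpy as np

# stand-in for the underlying (5, 1600, 2000) image stack
stack = np.zeros((5, 1600, 2000))
stack[:, 500, 500] = [1, 2, 3, 4, 5]

# images_object[:, 500, 500].toarray() would return the pixel at (500, 500)
# from each of the 5 images -- one value per image
pixels = stack[:, 500, 500]
print(pixels.shape)  # (5,)
```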
Mark Heppner
@mheppner
Jun 09 2016 14:33
Thanks. Is there a better way to specify exact images to load? Something like this:
rdds = [  thunder.images.fromtiff('path1'), thunder.images.fromtiff('path2'), ... ]
bigRdd = sc.union(rdds)
data = thunder.images.fromrdd(bigRdd)
Jeremy Freeman
@freeman-lab
Jun 09 2016 14:58
@mheppner interesting use, that should technically work, but i'm realizing it's a bit clunky :)
you can use wildcards for paths, but that doesn't let you specify multiple paths
it'd probably be pretty easy to support lists of paths
e.g. thunder.images.fromtiff(['path1', 'path2']) does that seem useful?
or we can add a concatenate method that would give you something like
Mark Heppner
@mheppner
Jun 09 2016 15:00
Yeah, that was the problem I was running into. Supporting the list would be very helpful. However, native Spark loading functions seem to use a comma separated string. Let me double check that first...
Jeremy Freeman
@freeman-lab
Jun 09 2016 15:00
this is the other way it could be supported
data1 = thunder.images.fromtiff('path1')
data2 = thunder.images.fromtiff('path2')
data = data1.concatenate(data2)
honestly we might want to add both
ah yeah, comma separated string sounds right
Mark Heppner
@mheppner
Jun 09 2016 15:02
I guess you could check for both. If it's a string, try to split by commas, otherwise check if it's a list. The other concatenate() API would be helpful too.
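(A rough sketch of that dispatch, using a hypothetical `normalize_paths` helper that is not part of thunder's API; it accepts either the Spark-style comma-separated string or a list:)

```python
def normalize_paths(path):
    """Accept a single path, a comma-separated string, or a list/tuple
    of paths, and return a list of paths (hypothetical helper)."""
    if isinstance(path, str):
        # Spark-style comma-separated string, e.g. 'path1,path2'
        return [p.strip() for p in path.split(',') if p.strip()]
    if isinstance(path, (list, tuple)):
        return list(path)
    raise TypeError('expected a string or a list of paths')
```

For example, `normalize_paths('path1, path2')` and `normalize_paths(['path1', 'path2'])` would both return `['path1', 'path2']`.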
Jeremy Freeman
@freeman-lab
Jun 09 2016 15:02
cool, mind opening two issues, one for each of these features?
Mark Heppner
@mheppner
Jun 09 2016 15:02
Sure thing!
Jeremy Freeman
@freeman-lab
Jun 09 2016 15:02
awesome thanks!
Mark Heppner
@mheppner
Jun 09 2016 15:10
Ok, there you go: #331, #332