These are chat archives for nextflow-io/nextflow

22nd
May 2017
Simone Baffelli
@baffelli
May 22 2017 09:42
Any likely cause for " ERROR ~ No such variable: process-- Check script 'test.nf' at line: 262 or see '.nextflow.log' file for more details"? I bet it's a typo somewhere, but I can't find it.
Paolo Di Tommaso
@pditommaso
May 22 2017 09:46
I guess you hit this nextflow-io/nextflow#141
Simone Baffelli
@baffelli
May 22 2017 09:47
Strange... it still worked recently.
Let me try with the brackets.
Indeed! It works now. Should this form be recommended as the best practice?
Paolo Di Tommaso
@pditommaso
May 22 2017 09:50
I would say so when using multiple output channels
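For reference, the bracketed form being suggested might look like this (process and channel names are hypothetical; per issue #141, declaring multiple target channels without parentheses can confuse the parser):

```nextflow
process example {

    output:
    // without parentheses this declaration can be misparsed:
    //   set val(id), file('result.txt') into ch_a, ch_b
    // the bracketed form makes the two target channels explicit:
    set val(id), file('result.txt') into (ch_a, ch_b)

    script:
    '''
    touch result.txt
    '''
}
```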
Simone Baffelli
@baffelli
May 22 2017 09:51
I do it all the time, 'cause I need the same data in multiple places. Was the fact that you cannot reuse the same channel multiple times influenced by FBP?
Paolo Di Tommaso
@pditommaso
May 22 2017 09:51
exactly
Simone Baffelli
@baffelli
May 22 2017 09:52
I see... still a little inconvenient at times; for certain processes I'm copying the same channel 6 times with different names
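As a sketch, the repeated-copy pattern being described might look like this (the channel and the six names are hypothetical, and this assumes a Nextflow version whose into operator accepts multiple target names):

```nextflow
// hypothetical fan-out: six independent copies of the same stream,
// one per downstream process that consumes the data
rates_channel.into { for_plot; for_stats; for_export; for_qc; for_stack; for_archive }
```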
Paolo Di Tommaso
@pditommaso
May 22 2017 09:54
I know, I'm trying to work on this limitation, though it's a bit challenging (not really just a bit)
Simone Baffelli
@baffelli
May 22 2017 09:55
I understand. I guess that's why it was discouraged in FBP in the first place?
Paolo Di Tommaso
@pditommaso
May 22 2017 09:55
yep, not possible at all
does this mean that you have 6 different parallel processes that consume the same output?
Simone Baffelli
@baffelli
May 22 2017 09:56
True... you would need some sort of patching
that inserts a copying process in between
but that would require parsing the script to identify the usages in the right context, I suppose
Paolo Di Tommaso
@pditommaso
May 22 2017 09:57
exactly
Simone Baffelli
@baffelli
May 22 2017 09:58
I guess it's easier in my python library, where:
  • I don't do any parsing
  • I don't have parallel processes at all
but with threads it's going to be rather tough
Paolo Di Tommaso
@pditommaso
May 22 2017 09:59
yep
Simone Baffelli
@baffelli
May 22 2017 10:03
So, while I'm bothering you, just a curiosity: would it be possible to extend the map file(something), val(something_else) into some_channel
in a way that it directly creates a mapping instead of a regular set?
Paolo Di Tommaso
@pditommaso
May 22 2017 10:04
what do you mean exactly?
Simone Baffelli
@baffelli
May 22 2017 10:05
I'm using some_channel.map({it -> map_names(it, my_list_of_names)}) where map_names creates the mapping for me. But to do so in one place would be convenient.
what do you mean exactly?
That set () into something could create a mapping instead of a list. This would be useful when you want to aggregate several values and you don't want to rely on the order of insertion to extract/filter them later on.
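A sketch of the workaround just described (map_names, some_channel, and the field names are all hypothetical): zipping each emitted tuple with a list of names yields a keyed map instead of a positional list.

```nextflow
// hypothetical helper: pair positional tuple elements with names
def map_names(values, names) {
    [names, values].transpose().collectEntries { name, value -> [(name): value] }
}

some_channel.map { it -> map_names(it, ['rate', 'sig_rate', 'stack_id']) }
```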
Paolo Di Tommaso
@pditommaso
May 22 2017 10:07
we are discussing something somewhat similar; please open a feature request, providing an example
we will continue the discussion there
Simone Baffelli
@baffelli
May 22 2017 10:08
Excellent, will do it after lunch. Makes sense not to clutter the discussion here.
Paolo Di Tommaso
@pditommaso
May 22 2017 10:13
:+1:
Simone Baffelli
@baffelli
May 22 2017 10:19
@pditommaso #343
Paolo Di Tommaso
@pditommaso
May 22 2017 10:20
:+1:
Simone Baffelli
@baffelli
May 22 2017 11:36
is it possible that I'm seeing #188
?
java.nio.channels.ClosedByInterruptException
ERROR ~ Unexpected error [IllegalArgumentException]
Paolo Di Tommaso
@pditommaso
May 22 2017 11:42
Um, can you share the full stack trace from the log file via pastebin?
Simone Baffelli
@baffelli
May 22 2017 11:42
sure
Paolo Di Tommaso
@pditommaso
May 22 2017 11:55
A bit ugly; it looks like something is wrong in an input definition at line 843
Is that possible?
Simone Baffelli
@baffelli
May 22 2017 11:56
I'll check
Paolo Di Tommaso
@pditommaso
May 22 2017 12:52
@baffelli have you found the problem?
Simone Baffelli
@baffelli
May 22 2017 13:26
not yet, I was busy talking with a student
Paolo Di Tommaso
@pditommaso
May 22 2017 13:29
ok, let me know at your convenience
Simone Baffelli
@baffelli
May 22 2017 13:58
still gives me the same error
Paolo Di Tommaso
@pditommaso
May 22 2017 14:01
can you show me the snippet having the problem?
Simone Baffelli
@baffelli
May 22 2017 14:01
sure. Here it is
gc_stack
        .view()
        .map({item->map_outputs(item,['rate','sig_rate','sig_ph','stack_id'])})
        .collect()
        .view()
        .into{stacked_to_plot}


process plot_small_multiple{

    publishDir "$params.results/velocity_series_{first_stack_id}_{last_stack_id}.pdf"

    input:
        val(rates_and_sigmas) from stacked_to_plot
        each file(shapefile) from feature_mask_for_plot


    output:
        file(velocity_series) into gc_plot

    shell:
        //read the first and last stack ID to assign a unique name
        first_stack_id = rates_and_sigmas[0]['stack_id']
        last_stack_id  = rates_and_sigmas[-1]['stack_id']
        stack_ids  = rates_and_sigmas.collect{ item -> item['stack_id'] }
        velocities = rates_and_sigmas.collect{ item -> item['rate'] }
        '''
        plot_velocities_small_multiple.py --rates !{make_string(velocities)} --timestamps !{make_string(stack_ids)} --outfile velocity_series --mask !{shapefile}
        '''


}
Paolo Di Tommaso
@pditommaso
May 22 2017 14:02
I guess it's this line
each file(shapefile) from feature_mask_for_plot
each file is not allowed (yet)
Simone Baffelli
@baffelli
May 22 2017 14:03
I see! Should I obtain it by using combine then?
Paolo Di Tommaso
@pditommaso
May 22 2017 14:03
yep
Simone Baffelli
@baffelli
May 22 2017 14:03
'cause I need the same file for all of them
I don't know whether you are interested, but I just had an idea for another operator.
Anyway, thank you a lot :+1: :+1:
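For reference, the combine-based replacement for the unsupported each file might look like this (channel names are taken from the snippet above; the resulting input declaration is an assumption):

```nextflow
// pair every emission of stacked_to_plot with the single shapefile
stacked_to_plot
    .combine(feature_mask_for_plot)
    .set { plot_inputs }

// and in the process, replace the two input lines with:
//     set val(rates_and_sigmas), file(shapefile) from plot_inputs
```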
Paolo Di Tommaso
@pditommaso
May 22 2017 14:04
very interested, open a feature request for that!
I'm very eager for new ideas from users
Simone Baffelli
@baffelli
May 22 2017 14:11
#344
If I'm not mistaken, it should be fairly straightforward to implement
Paolo Di Tommaso
@pditommaso
May 22 2017 14:15
Yes, I agree. I will give it a try