Philip Jonsson
@kpjonsson
Question: In Nextflow version 19.07.0.5106, I get WARN: The channel `create` method is deprecated -- it will be removed in a future release when I use the Channel.create() method, but I can't find any mention anywhere of what's replacing it. Is the warning wrong or is there some undocumented feature I'm missing?
Ghost
@ghost~57581dbcc2f0db084a1ccd04
Don't worry, it won't happen overnight
AFAIK you can safely already remove the stuff like ch_input = Channel.create()
Philip Jonsson
@kpjonsson
@MaxUlysse Got it, thanks.
Ghost
@ghost~57581dbcc2f0db084a1ccd04

And I do believe that was the important part:

to me the only use for create that makes sense is when used with choice
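
For context, a minimal sketch of the create-plus-choice pattern being referred to (pre-DSL2 syntax; the values and the routing closure are illustrative):

evens = Channel.create()
odds  = Channel.create()

Channel.from(1, 2, 3, 4, 5)
    .choice(evens, odds) { it % 2 == 0 ? 0 : 1 }   // route each item to one of the target channels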

Paolo Di Tommaso
@pditommaso
this is just great, isn't it ?
Hugues Fontenelle
@huguesfontenelle
hi!
How do I read the documentation for an older release? i.e. 19.04
well I guess I can simply read it on GitHub ..
Paolo Di Tommaso
@pditommaso
indeed
evanbiederstedt
@evanbiederstedt

@pditommaso
this is just great, isn't it ?
https://twitter.com/yokofakun/status/1159468857934929922

Confirmed, this is great

evanbiederstedt
@evanbiederstedt

https://gitter.im/nextflow-io/nextflow?at=5d4802a1475c0a0feb021c1b

Could you give us a few examples @pditommaso to illustrate this point? It still feels a bit abstract without a few concrete examples

Pierre Lindenbaum
@lindenb
@evanbiederstedt @pditommaso thanks! I'm still not convinced it's the right way to implement the idea. 1) one can still use a bash script to extract the data 2) or it would be better to use a 'static' method, something like: Channel.fromMap("my.bam").flatMap(Htsjdk::extractSamples)
Taylor Falk
@taylor.f_gitlab
Is there an easy way to pass two channel outputs and some associated strings to a process? I'm currently trying output = Channel.from(['filter1', channel1_out], ['filter2', channel2_out]) but this only returns the blank DataflowQueue(queue=[]) string inside the process.
Stijn van Dongen
@micans
@taylor.f_gitlab that would mean you'd have channels in a channel. What are you trying to achieve? What's in channel1_out and channel2_out?
Taylor Falk
@taylor.f_gitlab
@micans those channels contain singular files, so I am really just trying to run the next process on each output from channel1 and channel2, and passing along the correct filter string. How do I move those files into one channel, concat?
Stijn van Dongen
@micans

@taylor.f_gitlab do you have a source process? Normally you'd see something like

output: set val('filter'), file('*.txt') into channel1

then if there are multiple files in that output that you want to flatten you can use transpose().
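
A rough sketch of that pattern (pre-DSL2 syntax; the process body and file names are made up for illustration):

process filterStep {
    output:
    set val('filter1'), file('*.txt') into channel1

    """
    touch a.txt b.txt
    """
}

// transpose() turns ( 'filter1', [a.txt, b.txt] ) into ( 'filter1', a.txt ) and ( 'filter1', b.txt )
channel1
    .transpose()
    .view()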

Taylor Falk
@taylor.f_gitlab
Oh that's a good idea, let me try it that way.
Gabriel Abud
@G_Abud_twitter
I'm getting the error: fatal error: An error occurred (AccessDenied) when calling the ListObjectsV2 operation: Access Denied when trying to run a pipeline on AWS Batch. Pretty sure it's due to the s3 access of the workDir but I've double checked and I am the owner of that bucket. Any ideas?
Combiz
@combiz_k_twitter
Hi, this may be more of a singularity question than a nextflow one. I'm trying to run a test hello world nextflow script with -with-singularity image.sif on an HPC cluster. There seems to be an issue with mounting, as the short script runs but produces an error `Command error: Fatal error: cannot create 'R_TempDir'`. Any ideas? My guess is the singularity container can't write files to the HPC filesystem? Thanks for any pointers.
Combiz
@combiz_k_twitter
Ok so the script now runs with containerOptions '-B $PWD:/tmp' in the process
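
For reference, that directive sits inside the process definition roughly like this (a sketch; only the bind option comes from the message above):

process sayHello {
    containerOptions '-B $PWD:/tmp'   // bind the launch dir so the container can write its temp files

    """
    echo 'Hello world!'
    """
}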
Johannes Alneberg
@alneberg

A question regarding input of memory specifications. Using the config

params {
  memParam = 7.GB
}

process {
  withName:sayHello {
    memory = {2 * params.memParam}
  }
}

works fine when memParam is not overridden, but when it is (nextflow run main.nf -c base.conf --memParam 8.GB), I get either

ERROR ~ Error executing process > 'sayHello (3)'

Caused by:
  No signature of method: java.lang.Integer.multiply() is applicable for argument types: (java.lang.String) values: [8.GB]
Possible solutions: multiply(java.lang.Character), multiply(java.lang.Number)

or

Error executing process > 'sayHello (1)'

Caused by:
  Not a valid 'memory' value in process definition: 8.GB8.GB

depending on the order of the multiplication. What am I missing? (I'm on Nextflow version 19.04.1 build 5072)

Paolo Di Tommaso
@pditommaso
umm, because when passed on the command line the string 8.GB does not get parsed into a memory unit object
therefore for the interpreter it's a string
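
In other words, with a plain string the two orderings fail in the two ways shown above; an illustrative plain-Groovy snippet (not Nextflow-specific):

2 * '8.GB'    // MissingMethodException: no Integer.multiply(String)
'8.GB' * 2    // Groovy string repetition: '8.GB8.GB', not a valid memory value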
Johannes Alneberg
@alneberg
Is it possible to work around it?
Paolo Di Tommaso
@pditommaso
you may try 2 * (params.memParam as nextflow.util.MemUnit)
quite ugly
Johannes Alneberg
@alneberg
Well, better ugly than broken, I'll give it a try
it throws unable to resolve class nextflow.util.MemUnit
Alaa Badredine
@AlaaBadredine_twitter
not Util ?
Johannes Alneberg
@alneberg
Same error using Util I'm afraid
Alaa Badredine
@AlaaBadredine_twitter
did you put the parentheses?
Johannes Alneberg
@alneberg
Yes:
process {
  withName:sayHello {
    memory = {2 * (params.memParam as nextflow.util.MemUnit)}
  }
}
Alaa Badredine
@AlaaBadredine_twitter
nextflow.util.MemUnit()
maybe he meant it like that ?
Johannes Alneberg
@alneberg
That gave me some syntax error: expecting ')', found '(' @ line 7, column 60.
I think I fixed it
Alaa Badredine
@AlaaBadredine_twitter
nice
Johannes Alneberg
@alneberg
nextflow.util.MemoryUnit did the trick
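
Putting the pieces together, the working config would look roughly like this (a sketch assembled from the snippets above):

process {
  withName:sayHello {
    // --memParam arrives as a string, so coerce it to a MemoryUnit before multiplying
    memory = { 2 * (params.memParam as nextflow.util.MemoryUnit) }
  }
}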
Alaa Badredine
@AlaaBadredine_twitter
oh neat
Johannes Alneberg
@alneberg
Thank you both!
Alaa Badredine
@AlaaBadredine_twitter
you're welcome !
Combiz
@combiz_k_twitter
I'm having some trouble finding output files when NF is run via singularity. I write a file called 'test.csv' and NF gives "Missing output file(s) test.csv expected by process". If I cd to the NF workdir /rdsgpfs/general/ephemeral/user/ck/ephemeral/TestNF/work/f2/aca9181e283b109ffe55dc5e73d66a I can see the test.csv was produced. The file is saved to the workdir in R using write.table(df, "test.csv")
Steve Frenk
@sfrenk

I've just started playing around with DSL-2 and I'm trying to pass the output of a process into a new, named channel. I need to join this output channel with another channel further down the workflow, hence chaining operators directly from the process call doesn't work. I have a script that does something like this:

process1(parameters)

outputChannel = process1.out
    .ifEmpty { error "Stuff not produced" }
    .map { <do something> }

But I get the error:

nextflow.Session - Session aborted -- Cause: No signature of method: nextflow.script.ChannelArrayList.ifEmpty() is applicable for argument types: (Script_48650d62$_runScript_closure7$_closure12) values: [Script_48650d62$_runScript_closure7$_closure12@749f539e]

What am I doing wrong?

Steve Frenk
@sfrenk
Also, unrelated question - what's the current status of the potential DSL-2 unit testing feature?
Michael L Heuer
@heuermh
@lindenb @pditommaso Curious if you might write up how your extensions work, and what the right way to do extensions might be; I've wanted to do something similar in the past, but adding new dependencies to Nextflow isn't desirable for various reasons
Paolo Di Tommaso
@pditommaso
file splitting? Extend this; Pierre is using a different approach, consisting of a helper method that returns a closure doing the parsing
Stephen Kelly
@stevekm

@taylor.f_gitlab

Hi all, pretty sure I've scoured the docs with no results, but is there any syntax for have a file object work similar to a non-consumable value? For instance, a reference fasta that is getting used multiple times throughout a pipeline. Is there no better way than using .fromPath() each time?

Channel.fromPath('genome.fa').into { ref_fasta1; ref_fasta2; ref_fasta3; .... etc. }

I think in the new DSL2 for Nextflow you no longer have to do this, you can just set it once and use it repeatedly.
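
A minimal DSL2 sketch of that idea (process names and commands are made up; the enable flag depends on the Nextflow version, older previews used nextflow.preview.dsl=2):

nextflow.enable.dsl=2

process ALIGN {
    input:
    path genome

    script:
    "echo aligning against $genome"
}

process CALL_VARIANTS {
    input:
    path genome

    script:
    "echo calling variants against $genome"
}

workflow {
    ref_fasta = Channel.fromPath('genome.fa')
    // in DSL2 the same channel can feed several processes, so no .into{} fan-out is needed
    ALIGN(ref_fasta)
    CALL_VARIANTS(ref_fasta)
}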