Daniel E Cook
@danielecook
Anyone know why Date.parse doesn't work?
Maxime Garcia
@MaxUlysse
Hi @pditommaso I have some follow up question on @drpatelh question here
I understand why we can't traverse a directory over http, but is it possible to have something like https://raw.githubusercontent.com/maxulysse/test-datasets/sarek/file{1,2}.ext become two files: https://raw.githubusercontent.com/maxulysse/test-datasets/sarek/file1.ext and https://raw.githubusercontent.com/maxulysse/test-datasets/sarek/file2.ext, as it is with a regular path?
Paolo Di Tommaso
@pditommaso
yes, indeed
the glob is expanded to the actual files
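A minimal sketch of what that looks like in a pipeline (the URL is the one from the question above; the exact `Channel.fromPath` behaviour over http is what is being confirmed here, so treat this as illustrative):

```groovy
// Brace globs like {1,2} are expanded to concrete paths up front,
// so no directory listing is needed -- this should work even over http,
// where traversal is not possible.
Channel
    .fromPath('https://raw.githubusercontent.com/maxulysse/test-datasets/sarek/file{1,2}.ext')
    .view()  // one emission per expanded file
```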
Daniel E Cook
@danielecook
This works:
import java.text.SimpleDateFormat
date_parse = new SimpleDateFormat("yyyy-MM-dd")
date_filter = date_parse.parse(params.date)
Paolo Di Tommaso
@pditommaso
well, datetime is a mess in any programming lang
Maxime Garcia
@MaxUlysse
It is?? I'll go check my tests then, I just tried and it wasn't working
Paolo Di Tommaso
@pditommaso
well, actually I'm not sure :joy:
Maxime Garcia
@MaxUlysse
It works
I must have had another issue when I tried earlier
I'm guessing it'll work as well with s3://
Paolo Di Tommaso
@pditommaso
s3 can be traversed as a regular fs
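So the same kind of glob should work over s3 (the bucket and file names here are hypothetical, just to illustrate):

```groovy
// Unlike http, s3 supports listing, so wildcard globs can be
// resolved by actually traversing the bucket like a filesystem.
Channel
    .fromPath('s3://my-bucket/data/file*.ext')  // hypothetical bucket
    .view()
```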
Maxime Garcia
@MaxUlysse
Ok
thanks
Ashley S Doane
@DoaneAS
@rsuchecki thanks, yes pipeline is here: https://github.com/DoaneAS/realign.nf
Paolo Di Tommaso
@pditommaso
there could be something wrong in your script, NF is not supposed to use all that CPU
Daniel E Cook
@danielecook
@pditommaso agreed - but I'm curious why the groovy Date module doesn't work? Is it not imported?
Paolo Di Tommaso
@pditommaso
actually I'm not aware of it, therefore it's not imported :smile:
is there a groovy-date module ?
Ashley S Doane
@DoaneAS
@rsuchecki pretty simple nextflow with only 1 process actually. Just takes a sampleindex.csv file with sample name, sample type (tumor or normal), and bam file path, and does a realignment with speedseq. The executor is sge, and the error seems to happen when nextflow is determining which results are cached.
Daniel E Cook
@danielecook
err maybe not - but the language spec suggests you should be able to do just Date.parse(format, input)
examples are also present here: http://groovy-lang.org/single-page-documentation.html; just curious, maybe I'm missing something here
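For reference, the two routes being compared are the Groovy GDK static extension `Date.parse(format, text)` and the plain-Java `SimpleDateFormat` path that the earlier snippet confirmed works. A minimal sketch in plain Groovy (outside Nextflow, where the behaviour in question may differ):

```groovy
import java.text.SimpleDateFormat

// Groovy GDK static extension -- what the docs suggest should work:
def d1 = Date.parse('yyyy-MM-dd', '2018-11-14')

// Equivalent plain-Java route, known to work in the pipeline above:
def d2 = new SimpleDateFormat('yyyy-MM-dd').parse('2018-11-14')

// Both parse the same instant (midnight, default timezone)
assert d1 == d2
```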
Paolo Di Tommaso
@pditommaso
can't check now, you may want to report an issue
Daniel E Cook
@danielecook
I can do that
thanks
Paolo Di Tommaso
@pditommaso
welcome
Maxime Garcia
@MaxUlysse
OK, so I made a mistake earlier, it is not working in fact
I'll make a minimal example and an issue
Sri Harsha Meghadri
@harshameghadri
Hey folks, I am pretty new to using docker and singularity. I want to use the nf-core/rnaseq for my analysis. I consistently get errors while trying to execute this command singularity pull --name nf-core-rnaseq-1.3.img docker://nf-core/rnaseq:1.3
Unable to pull docker://nf-core/rnaseq:1.3: conveyor failed to get: Error reading manifest 1.3 in docker.io/nf-core/rnaseq: errors:
I am trying to pull to rackham. My analysis needs to be executed on bianca. Any tips are super appreciated.
Maxime Garcia
@MaxUlysse
try singularity pull --name nf-core-rnaseq-1.3.img docker://nfcore/rnaseq:1.3 instead, without the - in nf-core
I'm guessing if you have more questions about nf-core pipelines, it would be better on our slack: https://nf-co.re/join
Sri Harsha Meghadri
@harshameghadri
I tried that as well, getting the same error @MaxUlysse
Maxime Garcia
@MaxUlysse
You tried that on bianca?
Sri Harsha Meghadri
@harshameghadri
nope on rackham, I guess it needs internet
Maxime Garcia
@MaxUlysse
Sure
Just trying to find an easy mistake, sorry ;-)
Have you tried running that on an interactive node?
Sri Harsha Meghadri
@harshameghadri
hmmm on rackham? I don't have allocation there.
Maxime Garcia
@MaxUlysse
I'm afraid singularity might be too demanding on the regular login node
Let me see if I can help you in another way
Sri Harsha Meghadri
@harshameghadri
perfect, thank you Maxime.
Maxime Garcia
@MaxUlysse
@harshameghadri I messaged you ;-)
Marko Melnick
@Senorelegans
Is there any way to force a process to wait for another process to finish before it is started? I tried to do it with a dummy channel, but I am concatenating files, so the channel sizes change from one process to the next.
Maxime Garcia
@MaxUlysse
can't you use the output from the process that needs to be finished as an input for the other?
maybe with a .collect() to be sure to catch multiple executions of said process
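That suggestion could be sketched like this (process names, commands, and channels are made up for illustration; DSL as in Nextflow at the time of this chat):

```groovy
// Hypothetical pipeline: stepB must not start before every stepA task is done.
process stepA {
    input:
    file x from samples_ch

    output:
    file 'a.out' into a_done_ch

    "do_step_a ${x} > a.out"
}

process stepB {
    input:
    // collect() gathers all of stepA's outputs into a single emission,
    // so stepB waits for every stepA task, however many there are.
    file all_a from a_done_ch.collect()

    "do_step_b ${all_a}"
}
```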
Marko Melnick
@Senorelegans
I am concatenating fastq files by groups (I made a parser in groovy to separate by group). Is there no way to just make one process wait for another one with a dummy channel or some null variable?
I guess my real issue is that I am reading channels from pairs earlier. And I have a list of the file names with the condition group in a sample table that I can read, but I am struggling to bring them together and operate on them by group.