9d0cd7d2
@9d0cd7d2:matrix.org
[m]
profiles {
    singularity {
        singularity.enabled = true
        singularity.autoMounts = true
        process.container = 'alpine.3.8.simg'
    }
}
ChillyMomo
@ChillyMomo709

Hi all,

It seems I still get a 'configuration conflict' when I run awsbatch, like in the following bug: nextflow-io/nextflow#2370 .

Configuration conflict
This value was submitted using containerOverrides.memory which has been deprecated and was not used as an override. Instead, the MEMORY value found in the job definition’s resourceRequirements key was used. More information about the deprecated key can be found in the AWS Batch API documentation.

Nextflow version

  Version: 21.10.6 build 5660
  Created: 21-12-2021 16:55 UTC 
  System: Linux 5.11.0-1022-aws
  Runtime: Groovy 3.0.9 on OpenJDK 64-Bit Server VM 1.8.0_312-8u312-b07-0ubuntu1~20.04-b07
  Encoding: UTF-8 (UTF-8)

How to solve this?

ChillyMomo
@ChillyMomo709
Issue does not seem to happen with NXF_VER=21.04.1 nextflow run main.nf
Jeffrey Massung
@massung
Is there a process directive I can use to fail a workflow? Maybe I can just throw an exception, but I'm not sure if there's something nicer I should do instead. I basically have an if block in the directives section of a process, and if it's false I want to stop running and fail.
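For reference, one hedged option (a minimal sketch; the params.mode check is hypothetical): do the check at workflow level and abort with exit 1 plus a message, rather than inside the directives section:

workflow {
    // hypothetical guard: stop the whole run early with a clear message
    if( !(params.mode in ['fast', 'slow']) )
        exit 1, "Unsupported value for --mode: ${params.mode}"

    // ... rest of the pipeline
}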
Nathan Spix
@njspix

I have a workflow that involves splitting up files per chromosome and then merging them later. To make the workflow a bit more flexible, I first pull the chromosomes out of the reference file using a bit of grep, so I have a channel with all the chromosome names. I can then do something like this:

input:
tuple val(id), path(file) from channel_a
each chr from chromosomes

output:
tuple val(id), val(chr), path(outfile) into channel_b

then I can group things up:

channel_b.groupTuple(by: 0)

and use that as input for the next process.
My question is, since the number of chromosomes is constant for any given run of the workflow, can I extract that value (e.g. map{ it.readLines().size() }) and feed that into groupTuple? I thought perhaps I could assign this value to a variable and then pass that variable to the groupTuple call but this doesn't work (the type of the variable is something fancy, not an Int).
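For reference, a hedged, untested sketch of one approach: Nextflow's groupKey helper bakes the expected size into the grouping key, so groupTuple knows when each group is complete:

n_chr = chromosomes.count()   // value channel holding the chromosome count

channel_b
    .combine(n_chr)                                        // appends the count to every tuple
    .map { id, chr, f, n -> tuple(groupKey(id, n), chr, f) }
    .groupTuple()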

Shellfishgene
@Shellfishgene
The docs say that the merge operator will be removed soon. What's the replacement?
9d0cd7d2
@9d0cd7d2:matrix.org
[m]

Hi, this config worked for me:

singularity.enabled = true
singularity.runOptions = "--bind /path:/path"

Many thanks for your suggestion @tomraulet, it's finally working for me too!

xmzhuo
@xmzhuo

Can I set a val output emit from a script inside a process?
I tried to emit it as a val:

process test {
output:
   val  val_var , emit: val_var
  shell:
"""
val_var=test
"""
}

Error:
Caused by:
Missing value declared as output parameter: val_var

I also tried to emit it as an env:

process test{
output:
   env  val_var , emit: val_var
  shell:
"""
val_var=test
"""
}

When I tried to use it in the downstream process by calling test.out.val_var, I got:
Caused by:
No such property: val_var for class: ScriptC3863517AF925202A24F63BCD0003707
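For comparison, the documented env output pattern looks roughly like this in DSL2 (a minimal untested sketch; note it uses script: rather than shell:, since shell: blocks conventionally use single quotes and !{} placeholders):

process test {
    output:
    env val_var, emit: val_var

    script:
    """
    val_var=test
    """
}

workflow {
    test()
    test.out.val_var.view()   // should print: test
}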

Stathis
@isthisthat

hello, I'm just starting out in nextflow and seeking some strategic advice. What I'm trying to achieve is to run a workflow for a list of input samples. I can setup a workflow for a single sample, but how do I push multiple samples through it? (most examples in the docs show a single process) Here's where I'm stuck:

params.samples = ["samples/a","samples/b","samples/c"]
process step1 {
    input:
    file sample from samples_ch
    output:
    file 'result.bam' into step1_ch
    ...
}
process step2 {
    input:
    file bam from step1_ch
    output:
    file 'result.vcf' into step2_ch
    ...
}

This runs for sample a but not the rest; I suspect it's because step2 only accepts one thing from step1_ch?
I can see two general strategies: either make a workflow for a single sample and import that into a multi-sample wrapper, or enable each process to accept multiple inputs. Any advice would be greatly appreciated! Thanks
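A hedged note on the snippet above: samples_ch is never created from params.samples, and without a queue channel only one value flows through the processes. A minimal sketch of the missing line:

samples_ch = Channel.fromPath(params.samples)   // emits one item per sample path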

Håkon Kaspersen
@hkaspersen
Hello everyone, I am converting my pipeline to use singularity images. I use images from biocontainers. I am trying to make my pipeline portable, and for now I have created a script that downloads these images to a user-specified directory to run from. However, for some containers, such as R with specific packages, I could not find another solution than to build the image myself using a .def file with singularity build. This proved to work, but I am unsure how to make it portable. What are the best practices regarding images, and how should I do this to optimize functionality and user-friendliness?
9d0cd7d2
@9d0cd7d2:matrix.org
[m]
Hi, I tried to join the nf-tower gitter channel; does somebody know if it's only for Enterprise users? Thanks in advance
maxulysse
@maxulysse:matrix.org
[m]
It's for everyone
9d0cd7d2
@9d0cd7d2:matrix.org
[m]
I cannot join the channel from the URL; maybe somebody from the team needs to add me
Laurent Modolo
@l-modolo:matrix.org
[m]
Hi, I am trying unsuccessfully to implement the feedback loop pattern in DSL2. Is it possible to implement?
ebioman
@ebioman

If I see that correctly, the error strategy "ignore" will lead to "workflow.success" = true at the end.
When running many samples I would indeed like to ignore one failing sample, but then check at the very end whether any of them failed.
Is there a trace/object which can be accessed in main.nf where one could verify that at the end? Something like

Sample success
1 true
1 true
1 true
1 true
1 false
1 true
1 true

This would allow cleaning up e.g. published data which would otherwise be orphaned files

ebioman
@ebioman
Sorry, can't edit, but obviously it should be different samples, doh
tkwitsil
@tkwitsil

I am having an issue where an imported module has an implicit workflow.onComplete handler. When I run the main workflow, the imported workflow.onComplete handler is being triggered, I assume because wf2's "workflow" is in the wf1 namespace. Example code:

# wf1.nf
include { subworkflow } from "./wf2"

// wf1 implicit workflow
workflow {
    main:
        println('wf1 implicit workflow called')
}

// pulls wf2 implicit workflow.onComplete into this namespace and executes 

----------------------------------------
#wf2.nf
//explicitly named workflow that is imported to wf1
workflow subworkflow {
    main:
        println('wf2 as subworkflow called')
}

// wf2 implicit workflow
workflow {
    main:
        println('wf2 implicit workflow called')
}

// wf2 implicit workflow.onComplete handler
workflow.onComplete {
    log.info('wf2 implicit workflow completed')
}

Command and output is:

$ nextflow run wf1.nf 
N E X T F L O W  ~  version 21.10.2
Launching `wf1.nf` [awesome_euclid] - revision: e050c16fcb
wf1 implicit workflow called
wf2 implicit workflow completed

Is there a way to avoid this namespace clash while keeping the workflow.onComplete handler for wf2? Or do I need to pull the subworkflow in the example above out into its own separate file and have wf1 import directly from that?

pouya ahmadvand
@pouya1991
Hi, I am trying to use Nextflow to run singularity containers for my experiment pipeline. I am using slurm as the executor. The problem I am facing is that singularity has not been added to the node PATH, so Nextflow is not able to find singularity and gives this error:
env: ‘singularity’: No such file or directory
I submit my jobs through the head node which is different from the execution node.
I am wondering, is there any way to append a custom path for the singularity command lookup?
Thanks
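One hedged option (a sketch; the install path and module name depend on your cluster) is the beforeScript process directive, set once in nextflow.config:

// nextflow.config
process.beforeScript = 'export PATH=/opt/singularity/bin:$PATH'   // hypothetical path
// or, on module-based clusters:
// process.beforeScript = 'module load singularity'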
Peter Evans
@peterkevans
Hi all, first time poster... Sorry if this question has been asked but I'm struggling to find an answer.
I have a process that creates a bunch of fastq files and I need to create a channel from them in order to scatter the next process. I believe I need something like the following output statement, but I'm having trouble working out how to set sampleId to *, i.e. how to get the sampleId from the file name. Any help would be greatly appreciated. (I'm using DSL2)
  output: 
  tuple val(sampleId), path("fastq_files/*_R{1,2}_001.fastq.gz"), emit: fastq
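A hedged sketch of one way around this (the process name demux and the exact file-name pattern are assumptions): emit the files alone, then derive the id per file in the workflow:

output:
path "fastq_files/*_R{1,2}_001.fastq.gz", emit: fastq

// in the workflow:
demux.out.fastq
    .flatten()
    .map { f -> tuple( f.name.replaceAll(/_R[12]_001\.fastq\.gz$/, ''), f ) }
    .groupTuple()   // re-pairs R1/R2 per sampleId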
awgymer
@awgymer

I have a situation where I need some dynamic input values for a process, which I will need to fetch from a datastore as part of the pipeline. I was wondering what the best/accepted way of getting these values available to the process as variables is. My initial thought is to have the script that grabs the values from the datastore output a JSON file, and then use a JSON reader in the process that requires them to access them.

Something like:

proc1 {
     output: 
        path patient_data.json
     script:
     """
     python get_patient_data.py
     """
} 

proc2 {
     input:
         path patient_data_file
         path other_file 
    output:
         path some_output.file
    script:
    patient_data = jsonSlurper.parse(patient_data_file)
    """
    the_command --opt1 ${patient_data['val1']} --opt2 ${patient_data['val2']} other_file
    """
}

Is this a reasonable solution? (I am aware the actual code above won't work because I haven't properly created the jsonslurper)
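It reads as reasonable; here is a hedged, untested sketch of the second process with the slurper wired in (parsing happens on the head node when the task script is built):

import groovy.json.JsonSlurper

process proc2 {
    input:
    path patient_data_file
    path other_file

    output:
    path 'some_output.file'

    script:
    def patient_data = new JsonSlurper().parseText(patient_data_file.text)
    """
    the_command --opt1 ${patient_data['val1']} --opt2 ${patient_data['val2']} ${other_file}
    """
}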

Pablo
@pablo-esteban:matrix.org
[m]

Hi all, I am getting a java.nio.file.ProviderMismatchException when I run the following script:

process a {
    output:
        file _biosample_id optional true into biosample_id

    script:
    """
    touch _biosample_id
    """
}

process b {
    input:
        file _biosample_id from biosample_id.ifEmpty{file("_biosample_id")}

    script:
    def biosample_id_option = _biosample_id.isEmpty() ? '' : "--biosample_id \$(cat _biosample_id)"
    """
    echo \$(cat ${_biosample_id})
    """
}

I'm using a slightly modified version of the Optional Input pattern.

Any ideas on why I'm getting the java.nio.file.ProviderMismatchException?

Tim Dudgeon
@tdudgeon
Is it possible to declare that a task2 must wait for task1 to complete before it starts, in the case where task1 does not create an output that can be fed to task2? In my case task1 writes to a directory and task2 reads data from that directory after task1 has created it, but there is no specific output of task1. I can probably fabricate an output, but that sounds messy. To exemplify (with DSL2):
workflow mywf {

    take:
    data_dir

    main:
    task1(data_dir)
    task2(data_dir) // should wait for task1 to complete before starting
}
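A hedged sketch of the common workaround (fabricating a tiny "done" value is less messy than it sounds): give task1 a dummy output and pass it to task2 as an extra input it ignores:

// task1 additionally declares:  output: val 'done'
workflow mywf {
    take:
    data_dir

    main:
    ready = task1(data_dir)     // resolves only when task1 has finished
    task2(data_dir, ready)      // task2 declares a second input and simply ignores it
}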
Moritz E. Beber
@Midnighter
Hi, I was wondering if any of the groovy specialists have a good solution for the following: I have in a channel a tuple consisting of a hash map, a FastA file, and a CSV file. I would like to transform this in such a way that I get the hash map and FastA file plus a value from the CSV for each row in the CSV file. Thank you for any pointers.
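A hedged sketch with flatMap and plain Groovy file reading (the header row and the wanted column are assumptions):

ch.flatMap { meta, fasta, csv ->
    csv.readLines()
       .drop(1)                 // assumed: first line is a header
       .collect { line -> tuple(meta, fasta, line.split(',')[0]) }   // assumed: column 0
}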
Alaa Badredine
@AlaaBadredine_twitter
Hello, I would like some assistance with Nextflow operators, please. I have many samples, and each one has been fragmented and processed by chromosome in the pipeline. At the end, I would like to collect all the chromosomes belonging to the same sample. Using .collect() ends up generating more than 9k symlinked files in the same folder for each sample. Is there any way to collect them separately?
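A hedged sketch, assuming each item carries its sample id as the first element: groupTuple gathers per key, so each downstream task only stages one sample's files:

// hypothetical channel shape: [sample_id, chromosome_file]
per_chromosome_ch
    .groupTuple()   // emits one item per sample: [sample_id, [chr files...]]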
Kale Kundert
@kalekundert
Is it possible to use the DSL2 pipe operator with processes that have more than 1 input/output? For example, here's a snippet where I want to run process b 3 times on the output of a:
nextflow.enable.dsl = 2

process a {
    input:
        val x
    output:
        val y
    exec:
        y = x.toUpperCase()
}

process b {
    input:
        val x
        val n
    output:
        val y
    exec:
        y = "$x$n"
}

workflow {
    x = channel.value('a')
    n = channel.of(1..3)

    // I know these lines would work.
    //p = a(x)
    //b(p, n) | collect | view

    // Is there any way to do it all in one pipeline?
    a(x) | b(???, n) | collect | view
}
Richard Corbett
@RichardCorbett

I want to get a channel of the form:
['lib1', 'species1']
['lib1', 'species2']
['lib2', 'species3']

My process is parsing a kraken2 report text file to find any species present above some threshold per lib:

process select_species {
    input:
        tuple val(library_id), path(kraken_report)
                val(threshold)

    output: 
                tuple val("${library_id}"), stdout , emit: species_list

    script:
    """
        awk '\$1>${threshold} && \$4=="S" { print \$NF }' ${kraken_report} | grep -v sapiens
    """
}

This gives me a tuple that contains newlines, but I feel like I'm only 1 magic nextflow command away from getting my desired output.
Current output:

[P01900, coli
ananatis
oryzae
acnes
barophilus
VB_PmiS-Isfahan
]

Desired output:

[P01900, coli]
[P01900, ananatis]
[P01900,barophilus]
...
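A hedged sketch of that one command (names taken from the snippet above): flatMap over the stdout, splitting on newlines:

select_species.out.species_list
    .flatMap { lib, species ->
        species.trim().split('\n').collect { sp -> tuple(lib, sp.trim()) }
    }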
Bioninbo
@Bioninbo
Hi all. I put all my scripts in the bin folder, but when I call an R script from R it cannot find them, i.e. source('my_script.R') gives me "No such file or directory". For bash scripts it works though: I can run "my_script.sh" without specifying the path. But for a perl script I also need to give the path, i.e. perl "${projectDir}/bin/my_script.pl".
maxulysse
@maxulysse:matrix.org
[m]
I usually do just my_script.r
Bioninbo
@Bioninbo
I make my call from within R, to import functions.
maxulysse
@maxulysse:matrix.org
[m]
Oh I see
Let me check if I do that
Sorry, I don't have R scripts reading other R scripts, but I'd try giving the path as well, with baseDir or projectDir
Bioninbo
@Bioninbo
Yes, that is what I do. I just wanted to make sure there wasn't something I'd missed that would make the code cleaner. Thanks for the help @maxulysse:matrix.org
Priyanka Raina
@priyanka1009_twitter
Hi, I am trying to install nextflow on my Windows laptop; however, at the second step, when I type "curl -s https://get.nextflow.io | bash" in my terminal, I get the following error: "cmdlet Invoke-WebRequest at command pipeline position 1
Supply values for the following parameters:
Uri:"
Magnus Manske
@magnusmanske
Hi, I have an issue with nextflow tower.
My pipeline repo uses submodules, but tower doesn't use git clone --recursive.
I put manifest.recurseSubmodules = true into the "Nextflow config file" text area but it made no difference.
How can I make tower use --recursive?
Greg Gavelis Code Portfolio
@ggavelis
Hi, how would I join/mix more than two input tuples by key? Most of my process outputs exist as tuples (sample_id, output_file_path). As the last step in the workflow, I need a process that takes as input all output files for a given sample_id (e.g. "joining" 5+ tuples by key). But join limits us to only two channels, right?
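A hedged sketch: join is binary, but it chains naturally, matching on the first element by default (channel names are hypothetical):

a_ch.join(b_ch)
    .join(c_ch)
    .join(d_ch)
    .join(e_ch)   // yields [sample_id, fileA, fileB, fileC, fileD, fileE]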
Jonathan Broadbent
@Jonbroad15
Hi, I am receiving an error Missing process or function with name 'getAt' when calling a workflow with pipeline(input).
Any common causes for this error?
input is the output of a previous workflow
daudn
@daudn
How do I use the when directive as a "whenever"? I am passing it a map with values and saying when val1 = my_map.key, and when it finds a match, it only processes that one. I want it to process each one?
Amanda P.
@amanda-mp

I'm trying to write the contents of a channel, which is a tuple, to a file by converting it to a string. Ideally I want a csv file that has a row per tuple. Something like this:

    output:
    tuple patient, sample, "${outfile}.txt" into fileChannel
}

fileChannel.map { patient, sample, outfile -> "${patient},${sample},${outfile}\n"
                    }.collectFile(name: "myoutfile.csv", sort: true, storeDir: "mydir")

But I'm getting a No such variable: patient error when running the pipeline. I'm using the latest version of Nextflow and am wondering if the map syntax I'm using is outdated?
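A hedged guess at the fix (matching the DSL1 into style above): the error usually points at the output declaration rather than the map, since tuple elements need explicit val(...)/file(...) qualifiers:

output:
tuple val(patient), val(sample), file("${outfile}.txt") into fileChannel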

locdoan-genesolutions
@locdoan-genesolutions
Hello,
I have been using nextflow with slurm and I encountered this error:

Caused by:
  Oops.. something wrong happened while creating task 'trim' unique id -- Offending keys: [
 - type=java.util.UUID value=f69e3acd-c192-4574-9c02-8921bdaf695a, 
 - type=java.lang.String value=trim, 
 - type=java.lang.String value=trimmomatic PE -phred33 -threads 6 ${READS[0]} ${READS[1]} ${sample_id}_1_trim.fastq.gz ${sample_id}_1_UP_trim.fastq.gz \
${sample_id}_2_trim.fastq.gz ${sample_id}_2_UP_trim.fastq.gz \
ILLUMINACLIP:"TruSeq3-PE-2.fa":2:30:10 \
LEADING:3 TRAILING:3 SLIDINGWINDOW:4:15 CROP:75 MINLEN:36
rm -rf ${sample_id}_1_UP_trim.fastq.gz ${sample_id}_2_UP_trim.fastq.gz
, 
 - type=java.lang.String value=/home/hieu/G4500_new/singularity_imgs/trimmomatic.sif, 
 - type=java.lang.String value=sample_id, 
 - type=java.lang.String value=12-GAAE61_S8556-S8756, 
 - type=java.lang.String value=READS, 
 - type=nextflow.util.ArrayBag value=[FileHolder(sourceObj:/samples_new/12-GAAE61_S8556-S8756_R1.fastq.gz, storePath:/samples_new/12-GAAE61_S8556-S8756_R1.fastq.gz, stageName:12-GAAE61_S8556-S8756_R1.fastq.gz), FileHolder(sourceObj:/samples_new/12-GAAE61_S8556-S8756_R2.fastq.gz, storePath:/samples_new/12-GAAE61_S8556-S8756_R2.fastq.gz, stageName:12-GAAE61_S8556-S8756_R2.fastq.gz)], 
 - type=java.lang.String value=$, 
 - type=java.lang.Boolean value=true]

    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.codehaus.groovy.reflection.CachedConstructor.invoke(CachedConstructor.java:72)
    at org.codehaus.groovy.reflection.CachedConstructor.doConstructorInvoke(CachedConstructor.java:59)
    at org.codehaus.groovy.runtime.callsite.ConstructorSite$ConstructorSiteNoUnwrap.callConstructor(ConstructorSite.java:84)
    at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallConstructor(CallSiteArray.java:59)
    at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:263)
    at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:286)
    at nextflow.processor.TaskProcessor.computeHash(TaskProcessor.groovy:1988)
    at nextflow.processor.TaskProcessor$computeHash$55.callCurrent(Unknown Source)
    at nextflow.processor.TaskProcessor.createTaskHashKey(TaskProcessor.groovy:1975)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.codehaus.groovy.runtime.callsite.PlainObjectMetaMethodSite.doInvoke(PlainObjectMetaMethodSite.java:43)
    at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoCachedMethodSiteNoUnwrapNoCoerce.invoke(PogoMetaMethodSite.java:193)
    at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite.callCurrent(PogoMetaMethodSite.java:61)
    at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:185)
    at nextflow.processor.TaskProcessor.invokeTask(TaskProcessor.groovy:591)
    at nextflow.processor.InvokeTaskAdapter.call(InvokeTaskAdapter.groovy:59)
    at groovyx.gpars.dataflow.operator.DataflowOperatorActor.startTask(DataflowOperatorActor.java:120)
    at groovyx.gpars.dataflow.operator.ForkingDataflowOperatorActor.access$001(ForkingDataflowOperatorActor.java:35)
    at groovyx.gpars.dataflow.operator.ForkingDataflowOperatorActor$1.run(ForkingDataflowOperatorActor.java:58)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.IllegalStateException: Unable to hash content: /samples_new/12-GAAE61_S8556-S8756_R1.fastq.gz
    at nextflow.util.CacheHelper.hashFileContent(CacheHelper.java:350)
    at nextflow.util.CacheHelper.hashFile(CacheHelper.java:261)
    at nextflow.util.CacheHelper.hasher(CacheHelper.java:186)
    at nextflow.util.CacheHelper.hasher(CacheHelper.java:183)
    at nextflow.util.CacheHelper.hasher(CacheHelper.java:111)
    at nextflow.util.CacheHelper.hasher(CacheHelper.java:107)
    at nextflow.util.CacheHelper.hashUnorderedCollection(CacheHelper.java:376)
    at nextflow.util.CacheHelper.hasher(CacheHelper.java:174)
    at nextflow.util.CacheHelper.hasher(CacheHelper.java:178)
    at nextflow.util.CacheHelper.hasher(CacheHelper.java:111)
    at nextflow.util.CacheHelper.hasher(CacheHelper.java:107)
    at nextflow.util.CacheHelper$hasher$12.call(Unknown Source)
    at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:47)
    at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:125)
    at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:148)
    at nextflow.processor.TaskProcessor.computeHash(TaskProcessor.groovy:1984)
    ... 18 common frames omitted
Caused by: java.nio.file.AccessDeniedException: /samples_new/12-GAAE61_S8556-S8756_R1.fastq.gz
    at sun.nio.fs.UnixException.translateToIOException(UnixException.java:84)
    at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
    at sun.n
Hi all, does anyone know when this kind of error happens?
Chadi Saad
@chadisaad
Hi all, I have a channel of 1 file (output of a process)
How do I convert the file path into a string?
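A hedged one-liner sketch (channel name assumed):

my_file_ch.map { it.toString() }   // or it.toAbsolutePath().toString()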
xmzhuo
@xmzhuo
Failing to pass a variable to docker in the config file, any suggestions?
docker {
    enabled = true
    runOptions = '-e var="test"'
}
Igor Dulesov
@IgorDulesov_twitter
Hi all.
I'm not an engineer, so sorry if my question looks stupid.
We are creating a desktop app for processing sequencing data from Illumina and then generating a report on pathogens from the infected samples. The app should be a one-click installation, so the Nextflow installation step should be integrated into that one-click installer.
I know that Nextflow can be auto-installed on Mac/Linux; however, Windows seems to need manual installation steps.
My question is: can it be automated, or is there no way?
Moritz E. Beber
@Midnighter

Can someone tell me how to use the nextflow CsvSplitter in Groovy code? I can't even seem to figure out the correct constructor.

import nextflow.splitter.CsvSplitter

CsvSplitter.options([file: 'foo.csv', sep: '\t'])

but I get a type mismatch which I don't know how to fix, or even whether this is the right way to do it in general.

groovy.lang.MissingMethodException: No signature of method: static nextflow.splitter.CsvSplitter.options() is applicable for argument types: (LinkedHashMap) values: [[file:foo.csv, sep:    ]]
Possible solutions: options(java.util.Map), options(java.util.Map), options(java.util.Map), print(java.lang.Object), print(java.io.PrintWriter)
    at ConsoleScript8.runScript(ConsoleScript8:6)
    at nextflow.script.BaseScript.runDsl1(BaseScript.groovy:163)
    at nextflow.script.BaseScript.run(BaseScript.groovy:200)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
Jonathan Broadbent
@Jonbroad15
Is it possible to somehow run nextflow clean -but [list of runs]?
FriederikeHanssen
@FriederikeHanssen

Hi y'all! Maybe someone can help me. I have two channels, one containing normal and one containing tumor samples. I need to create a third channel that contains all tumor samples that do not have a matching key in the normal samples. Sort of a group operation spitting out only the 'remainder'. My idea was to convert the keys from normal to a list and then 'filter' the tumor samples based on the key not being in the list:

        normallist = cram_variant_calling_normal_cross.map{ patient, meta, cram, crai -> [patient] }.toList()

        tumor_only = cram_variant_calling_tumor_cross.filter{ patient, meta, cram, crai ->
             !(normallist.contains(patient))
        }

This does not work. I have also tried to use collect(), collect().toList(), and .subscribe onNext: { normallist.add(it) }. I have googled around a bit and found https://github.com/nextflow-io/nextflow/discussions/2547 and https://github.com/nextflow-io/nextflow/discussions/2275, but neither has helped solve my problem so far. Any hints? Or is there a much easier way to achieve the above?
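A hedged, untested sketch of one workaround: the snag is that toList()/collect() yield a channel, not a plain list, so contains() never sees the data. Materialize the keys, nest the list so combine keeps it as one element, then filter in plain Groovy:

normal_keys = cram_variant_calling_normal_cross
    .map { patient, meta, cram, crai -> patient }
    .collect()
    .map { keys -> [keys] }    // nest so combine appends the list as a single element

tumor_only = cram_variant_calling_tumor_cross
    .combine(normal_keys)
    .filter { patient, meta, cram, crai, keys -> !(patient in keys) }
    .map { patient, meta, cram, crai, keys -> tuple(patient, meta, cram, crai) }
// note: does not handle the edge case where the normal channel is empty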
