These are chat archives for nextflow-io/nextflow

6th Apr 2017
mitul-patel
@mitul-patel
Apr 06 2017 11:59

dataDir = "/home/patel/pipelines"
params.genome = "/home/patel/pipelines/genome.fasta"
params.outdir = "$PWD"

GENOME = file(params.genome)
Genome_base = file(params.genome).name
GENOME_path = file(params.genome).parent

/*
 * Input read files
 */

Channel
.fromFilePairs("$dataDir/unmapped_{1,2}.fastq", flat: true)
.ifEmpty {error "Cannot find any reads matching: " }
.into{ reads_fastqc; reads_kallisto; reads_STAR }

process STAR_index {

executor 'local'
cpus 10
memory '32 GB'
tag { STAR: $Genome_base }

publishDir "$PWD", mode:'copy', overwrite: true

input:
file fasta from GENOME

output:
file "STAR_genome/*" into STAR_INDEX

"""
mkdir -p STAR_genome
STAR --runMode genomeGenerate --genomeDir STAR_genome --genomeSAindexNbases 3 --genomeFastaFiles $fasta

"""

}

process STAR_mapping {

executor 'local'
cpus 10
memory '32 GB'
tag { pair_id }


publishDir "$PWD", mode:'copy', overwrite: true

input:
set pair_id, file(reads) from reads_STAR
file STAR_INDEX

output:
file "STAR_mapping/${pair_id}.star*" into STAR_out

"""

mkdir -p STAR_mapping
STAR --runThreadN 10 --genomeDir STAR_genome --readFilesIn ${reads} --outSAMstrandField intronMotif --outSAMattributes All --outFilterIntronMotifs RemoveNoncanonical --outSAMtype BAM SortedByCoordinate --outFileNamePrefix ./STAR_mapping/${pair_id}.star
"""
}

I tried to run the STAR command on my local computer and it works... but not in Nextflow..
Evan Floden
@evanfloden
Apr 06 2017 12:00
Hi @mitul-patel, welcome to the NF Gitter community.
mitul-patel
@mitul-patel
Apr 06 2017 12:02
@skptic thanks...
Phil Ewels
@ewels
Apr 06 2017 12:02
It's kind of impossible to read your code there with the broken formatting
Do you think you could put it in a pastebin or something instead?
Paolo Di Tommaso
@pditommaso
Apr 06 2017 12:03
it's a polite but demanding community ! :)
Phil Ewels
@ewels
Apr 06 2017 12:04
:blush:
mitul-patel
@mitul-patel
Apr 06 2017 12:04

dataDir = "/home/patel/pipelines"
params.genome = "/home/patel/pipelines/genome.fasta"
params.outdir = "$PWD"

GENOME = file(params.genome)
Genome_base = file(params.genome).name
GENOME_path = file(params.genome).parent

/*
 * Input read files
 */

Channel
.fromFilePairs("$dataDir/unmapped_{1,2}.fastq", flat: true)
.ifEmpty {error "Cannot find any reads matching: " }
.into{ reads_fastqc; reads_kallisto; reads_STAR

Paolo Di Tommaso
@pditommaso
Apr 06 2017 12:04
umm
please use http://pastebin.com to share your code
Maxime Garcia
@MaxUlysse
Apr 06 2017 12:05
:+1:
mitul-patel
@mitul-patel
Apr 06 2017 12:10

dataDir = '/home/patel/pipelines'
params.genome = '/home/patel/pipelines/genome.fasta'
params.outdir = '$PWD'

GENOME = file(params.genome)
Genome_base = file(params.genome).name
GENOME_path = file(params.genome).parent

/*
 * Input read files
 */

Channel
.fromFilePairs('$dataDir/unmapped_{1,2}.fastq', flat: true)
.ifEmpty {error 'Cannot find any reads matching: ' }
.into{ reads_fastqc; reads_kallisto; reads_STAR }

process STAR_index {

executor 'local'
cpus 10
memory '32 GB'
tag { STAR: $Genome_base }

publishDir '$PWD', mode:'copy', overwrite: true

input:
file fasta from GENOME

output:
file 'STAR_genome/*' into STAR_INDEX

'''

mkdir -p STAR_genome
STAR --runMode genomeGenerate --genomeDir STAR_genome --genomeSAindexNbases 3 --genomeFastaFiles $fasta

'''
}

process STAR_mapping {

executor 'local'
cpus 10
memory '32 GB'
tag { pair_id }


publishDir '$PWD', mode:'copy', overwrite: true

input:
set pair_id, file(reads) from reads_STAR
file STAR_INDEX

output:
file '${pair_id}.star*' into STAR_out

'''

mkdir -p STAR_mapping
STAR --runThreadN 10 --genomeDir $PWD/STAR_genome --readFilesIn ${reads} --outSAMstrandField intronMotif --outSAMattributes All --outFilterIntronMotifs RemoveNoncanonical --outSAMtype BAM SortedByCoordinate --outFileNamePrefix ./STAR_mapping/${pair_id}.star
'''
}

Do I have to select the syntax?
Evan Floden
@evanfloden
Apr 06 2017 12:10
No, this pastebin link is fine: https://pastebin.com/msa0DH7q
mitul-patel
@mitul-patel
Apr 06 2017 12:11
ok thanks...
Phil Ewels
@ewels
Apr 06 2017 12:14
Could you tell us a little bit more about how it's not working?
How are you running it - with what data and inputs? Does it give you any log output? At what step does it break? What hardware are you using? etc...
mitul-patel
@mitul-patel
Apr 06 2017 12:23
I am using Ubuntu... I am running the STAR aligner to align pairs of fastq files to a genome. The first step is to create the index file and then do the mapping. I am getting an error at the mapping step. The error is about a missing output file, but when I checked manually the file was there....
the inputs are paired-end files for each sample
and the expected outputs are STAR alignment files.
Evan Floden
@evanfloden
Apr 06 2017 12:25
I have created a new pastebin with a few edits to your script here: https://pastebin.com/KwsBaSLV
I would highly suggest you look at our pipeline here: https://github.com/CRG-CNAG/CalliNGS-NF/blob/master/main.nf
It contains both processes you are trying to complete
Also, I would suggest reading through the following tutorial to understand what is happening at each step: https://public_docs.crg.es/rguigo/courses/ngs17/#_pipeline_implementaton
You can find the solutions to the problems here: https://github.com/CRG-CNAG/ngs2017-nf
Karin Lagesen
@karinlag
Apr 06 2017 13:18
ok, I am creating a publishDir for the results in my pipeline. I'd like to copy the run script and any config files into a specific dir under publishDir. Are there any provisions for doing something like this?
Phil Ewels
@ewels
Apr 06 2017 13:39
You could probably do it in a process, using ${baseDir}
eg. cat ${baseDir}/main.nf > pipeline.nf, then pick that up as an output
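For illustration, a minimal sketch of that idea (untested; the process name, the out_dir parameter and the assumption that a nextflow.config sits next to main.nf are hypothetical, not from the chat):

process save_run_info {
    publishDir "${params.out_dir}/run_info", mode: 'copy'

    output:
    file 'pipeline.nf' into saved_script
    file 'pipeline.config' into saved_config

    // copy the pipeline script and config into the work dir so publishDir picks them up
    """
    cat ${baseDir}/main.nf > pipeline.nf
    cat ${baseDir}/nextflow.config > pipeline.config
    """
}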
Phil Ewels
@ewels
Apr 06 2017 13:51
You can also print all of the loaded config I think - maybe better than printing the config files (it's after evaluation, so it would show exactly what NF is actually using)
@pditommaso would know better about how to do this within a running pipeline though
Karin Lagesen
@karinlag
Apr 06 2017 13:53
:)
Phil Ewels
@ewels
Apr 06 2017 13:54
Might be better to do it manually with the things that you think are important though - would make it easier to read afterwards I imagine
Outside of the running pipeline you can do nextflow config main.nf to print all of the parsed config options
Karin Lagesen
@karinlag
Apr 06 2017 14:01
yep
I am paranoid enough that I want to just stuff everything into one bucket every time, just so that I know that it's there
working with too many people with sketchy relationships to logging
one of them being me...
Karin Lagesen
@karinlag
Apr 06 2017 14:09
ok, question re fromFilePairs
I have the following files:
../testdata/short/Angen-bacDNA2-78-2013-01-4718_S29_L001_R1_001.short.fastq.gz
../testdata/short/Angen-bacDNA2-78-2013-01-4718_S29_L001_R2_001.short.fastq.gz
../testdata/short/Angen-bacDNA2-78-2013-01-4718_S29_L002_R1_001.short.fastq.gz
../testdata/short/Angen-bacDNA2-78-2013-01-4718_S29_L002_R2_001.short.fastq.gz
../testdata/short/Angen-bacDNA2-79-2013-01-4835_S30_L001_R1_001.short.fastq.gz
../testdata/short/Angen-bacDNA2-79-2013-01-4835_S30_L001_R2_001.short.fastq.gz
../testdata/short/Angen-bacDNA2-79-2013-01-4835_S30_L002_R1_001.short.fastq.gz
../testdata/short/Angen-bacDNA2-79-2013-01-4835_S30_L002_R2_001.short.fastq.gz
../testdata/short/Angen-bacDNA2-92-2013-01-5057_S44_L001_R1_001.short.fastq.gz
../testdata/short/Angen-bacDNA2-92-2013-01-5057_S44_L001_R2_001.short.fastq.gz
../testdata/short/Angen-bacDNA2-92-2013-01-5057_S44_L002_R1_001.short.fastq.gz
../testdata/short/Angen-bacDNA2-92-2013-01-5057_S44_L002_R2_001.short.fastq.gz
I want to get four files into one pair
I am trying the following code:
Channel
    .fromFilePairs("../testdata/short/*L00{1,2}*R{1,2}*.short.fastq.gz", size:2)
    .ifEmpty { error "Cannot find any reads matching: ${params.reads}" }
    .println()
which does work when size is 2, but not if it's 4
Phil Ewels
@ewels
Apr 06 2017 14:12
Yes, fromFilePairs is a custom NF function, I don't think it will work with two sets of groupings like that
You will probably have to write your own grouping function
Karin Lagesen
@karinlag
Apr 06 2017 14:12
but, but but... hmmm
I'm reasonably convinced @pditommaso told me it would work, a while ago
Phil Ewels
@ewels
Apr 06 2017 14:13
Ah yeah? Maybe then - I'm only guessing
Karin Lagesen
@karinlag
Apr 06 2017 14:14
:)
Phil Ewels
@ewels
Apr 06 2017 14:14
if it helps, this is the function that we used to use before fromFilePairs existed:
Channel
    .fromPath( params.reads )
    .ifEmpty { error "Cannot find any reads matching: ${params.reads}" }
    .map { path ->
        def prefix = readPrefix(path, params.reads)
        tuple(prefix, path)
    }
    .groupTuple(sort: true)
    .set { read_files }
I imagine some similar logic could work here too
I'd be interested to hear how you do this actually, as we were thinking of adding similar support for multiple lanes
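A rough sketch of that kind of grouping for the four-files-per-sample case above (untested; the sample-name extraction via split and the channel name are assumptions, not something from the chat):

Channel
    .fromPath('../testdata/short/*_L00{1,2}_R{1,2}_001.short.fastq.gz')
    .ifEmpty { error "Cannot find any reads matching the pattern" }
    .map { path ->
        // group key = everything before the lane identifier, e.g. Angen-bacDNA2-92-2013-01-5057_S44
        def sample = path.name.split('_L00')[0]
        tuple(sample, path)
    }
    .groupTuple(sort: true, size: 4)
    .set { read_files }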
Karin Lagesen
@karinlag
Apr 06 2017 14:15
if I figure it out, I will ping you :)
Phil Ewels
@ewels
Apr 06 2017 14:17
What are the file groupings that you would like to get out at the end?
Karin Lagesen
@karinlag
Apr 06 2017 14:19
everything up to the lane identifier, for instance Angen-bacDNA2-92-2013-01-5057_S44
and the lane identifier is the first glob
Phil Ewels
@ewels
Apr 06 2017 14:20
What happens if you use size: -1?
Karin Lagesen
@karinlag
Apr 06 2017 14:21
hmmm
then I get one file at a time
with everything up to the first . as the grouping key
Karin Lagesen
@karinlag
Apr 06 2017 14:26
btw @ewels , you are involved in the GI-RNAseq, if I'm not mixing people up?
should be NGI-RNAseq :)
Phil Ewels
@ewels
Apr 06 2017 14:27
I am :)
Karin Lagesen
@karinlag
Apr 06 2017 14:27
ok, so, the log.info part, how does that work?
couldn't find it in the docs?
Phil Ewels
@ewels
Apr 06 2017 14:35
how do you mean?
Karin Lagesen
@karinlag
Apr 06 2017 14:35
I just went ahead and tested it
stuff gets printed on screen!
Phil Ewels
@ewels
Apr 06 2017 14:35
haha, yup! :tada:
Karin Lagesen
@karinlag
Apr 06 2017 14:35
me happy :) (guess I'm easy to please :))
Phil Ewels
@ewels
Apr 06 2017 14:36
you can do log.warn and stuff too
Karin Lagesen
@karinlag
Apr 06 2017 14:36
yeah
but I can't find it in the docs though...
oh, well, my search fu is probably just off
Phil Ewels
@ewels
Apr 06 2017 14:37
I think I just copied it from someone else's pipeline
..standing on the shoulders of giants and all that ;)
Maxime Garcia
@MaxUlysse
Apr 06 2017 14:37
:+1:
Karin Lagesen
@karinlag
Apr 06 2017 14:37
copy-paste coding is generally considered good coding practice, so
Evan Floden
@evanfloden
Apr 06 2017 15:00
:laughing: :laughing:
Karin Lagesen
@karinlag
Apr 06 2017 15:04
I need that book!
@pditommaso , any suggestions on my mucho annoying fromFilePairs described a bit above?
(if you have the time)
Paolo Di Tommaso
@pditommaso
Apr 06 2017 15:09
let me see
Karin Lagesen
@karinlag
Apr 06 2017 15:09
thankyou :)
Paolo Di Tommaso
@pditommaso
Apr 06 2017 15:10
if you want 4 files, WHY do you specify size: 2 ?
Karin Lagesen
@karinlag
Apr 06 2017 15:10
it works with 2, but I want 4
I have run it with both, and it fubars with 4
well, not exactly fubar, but still :)
Paolo Di Tommaso
@pditommaso
Apr 06 2017 15:11
umm
and how do you match 4 files? I mean what pattern are you specifying ?
Karin Lagesen
@karinlag
Apr 06 2017 15:13
this is the entirety of the code, I'm just trying to figure out how this works
Paolo Di Tommaso
@pditommaso
Apr 06 2017 15:13
ah this snippet
Channel
    .fromFilePairs("../testdata/short/*L00{1,2}*R{1,2}*.short.fastq.gz", size:2)
    .ifEmpty { error "Cannot find any reads matching: ${params.reads}" }
    .println()
Karin Lagesen
@karinlag
Apr 06 2017 15:13
yes :)
I am pretty sure I just don't understand how the grouping stuff works
for instance, that code doesn't seem to work atm
but this does:
Paolo Di Tommaso
@pditommaso
Apr 06 2017 15:16
wait
Karin Lagesen
@karinlag
Apr 06 2017 15:16
Channel
    .fromFilePairs("../testdata/short/*L00{1,2}*R{1,2}_001.short.fastq.gz", size:2)
    .ifEmpty { error "Cannot find any reads matching: ${params.reads}" }
    .println()
Paolo Di Tommaso
@pditommaso
Apr 06 2017 15:19
this works
params.reads = "*{1,2}_R{1,2}_001.short.fastq.gz"

Channel
    .fromFilePairs(params.reads, size:4)
    .ifEmpty { error "Cannot find any reads matching: ${params.reads}" }
    .println()
[Angen-bacDNA2-78-2013-01-4718_S29_L00, [/Users/pditommaso/Downloads/test/Angen-bacDNA2-78-2013-01-4718_S29_L001_R1_001.short.fastq.gz, /Users/pditommaso/Downloads/test/Angen-bacDNA2-78-2013-01-4718_S29_L001_R2_001.short.fastq.gz, /Users/pditommaso/Downloads/test/Angen-bacDNA2-78-2013-01-4718_S29_L002_R1_001.short.fastq.gz, /Users/pditommaso/Downloads/test/Angen-bacDNA2-78-2013-01-4718_S29_L002_R2_001.short.fastq.gz]]
[Angen-bacDNA2-79-2013-01-4835_S30_L00, [/Users/pditommaso/Downloads/test/Angen-bacDNA2-79-2013-01-4835_S30_L001_R1_001.short.fastq.gz, /Users/pditommaso/Downloads/test/Angen-bacDNA2-79-2013-01-4835_S30_L001_R2_001.short.fastq.gz, /Users/pditommaso/Downloads/test/Angen-bacDNA2-79-2013-01-4835_S30_L002_R1_001.short.fastq.gz, /Users/pditommaso/Downloads/test/Angen-bacDNA2-79-2013-01-4835_S30_L002_R2_001.short.fastq.gz]]
[Angen-bacDNA2-92-2013-01-5057_S44_L00, [/Users/pditommaso/Downloads/test/Angen-bacDNA2-92-2013-01-5057_S44_L001_R1_001.short.fastq.gz, /Users/pditommaso/Downloads/test/Angen-bacDNA2-92-2013-01-5057_S44_L001_R2_001.short.fastq.gz, /Users/pditommaso/Downloads/test/Angen-bacDNA2-92-2013-01-5057_S44_L002_R1_001.short.fastq.gz, /Users/pditommaso/Downloads/test/Angen-bacDNA2-92-2013-01-5057_S44_L002_R2_001.short.fastq.gz]]
Karin Lagesen
@karinlag
Apr 06 2017 15:21
ahhhh I get it
I can only have one star
and that star is on the left side of where the variable stuff begins :)
mitul-patel
@mitul-patel
Apr 06 2017 15:21
Many thanks @skptic, @pditommaso and @ewels. I have solved the problem now.... I have a question regarding process execution... In my pipeline there are 5 processes. When I run the pipeline, the processes are submitted in a random order, not the order in which they appear in the workflow.nf file. Is this ok? Are there any options to submit them in a specific order?
Paolo Di Tommaso
@pditommaso
Apr 06 2017 15:23
@karinlag yes, the star needs to capture the prefix
Karin Lagesen
@karinlag
Apr 06 2017 15:24
That makes sense :)
Paolo Di Tommaso
@pditommaso
Apr 06 2017 15:24
@mitul-patel because they are executed in .. parallel! thus the execution order is not deterministic
mitul-patel
@mitul-patel
Apr 06 2017 15:26
@pditommaso Thanks. That makes sense.... Can I specify how many processes to submit in parallel? Is there any parameter for that?
Paolo Di Tommaso
@pditommaso
Apr 06 2017 15:27
yes, you can use the -qs n (queue size) command line option
Evan Floden
@evanfloden
Apr 06 2017 15:29
And I guess it is obviously limited by the number of CPUs you have!?
Paolo Di Tommaso
@pditommaso
Apr 06 2017 15:30
when using the local executor yes, when using a batch scheduler I think the default is 100
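For reference, a minimal sketch (the value 4 is purely illustrative): on the command line that would be nextflow run main.nf -qs 4, or equivalently in nextflow.config:

executor {
    // maximum number of tasks the executor handles in parallel
    queueSize = 4
}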
Evan Floden
@evanfloden
Apr 06 2017 15:30
:+1:
Karin Lagesen
@karinlag
Apr 06 2017 15:44
....how do I go about getting whatever is in a channel into a list or something?
Paolo Di Tommaso
@pditommaso
Apr 06 2017 15:45
uh? what do you mean ?
Karin Lagesen
@karinlag
Apr 06 2017 15:45
My results are a bunch of directories which I then put into a channel
i.e. I have a process that runs for several datasets that all stuff their results into the same channel
Paolo Di Tommaso
@pditommaso
Apr 06 2017 15:46
are we talking about the previous snippet or a different piece of code ?
Karin Lagesen
@karinlag
Apr 06 2017 15:46
not the previous snippet
Paolo Di Tommaso
@pditommaso
Apr 06 2017 15:47
ok, combine collect
Karin Lagesen
@karinlag
Apr 06 2017 15:48
awesome, thanks! I suspect that will solve things :)
Paolo Di Tommaso
@pditommaso
Apr 06 2017 15:48
hope so :)
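A tiny sketch of what collect does (the channel name is hypothetical): it waits for every item and then emits them all as a single list, which a downstream process can take as one input.

result_dirs
    .collect()
    .view()    // prints one list containing every result directory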
mitul-patel
@mitul-patel
Apr 06 2017 16:08
Can I restart the workflow if it exited with an error? I mean, resume the workflow from where it stopped last time?
Evan Floden
@evanfloden
Apr 06 2017 16:08
with -resume
See here
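i.e. something along the lines of (the script name is just a placeholder):

nextflow run main.nf -resume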
mitul-patel
@mitul-patel
Apr 06 2017 16:12
Great.. Thanks..
Karin Lagesen
@karinlag
Apr 06 2017 17:27
So, inside of a process, is there any way to figure out what is really coming in/being sent out through the channels? Can I print something in the input/output declarations?
Evan Floden
@evanfloden
Apr 06 2017 17:29
What about doing an echo of your NF variables?
Paolo Di Tommaso
@pditommaso
Apr 06 2017 17:30
not a specific feature for that
you can use view to print the content of a specific in/out channel
Karin Lagesen
@karinlag
Apr 06 2017 17:31
...how do I use view inside of a process..?
Paolo Di Tommaso
@pditommaso
Apr 06 2017 17:32
not exactly inside a process, eg.
Karin Lagesen
@karinlag
Apr 06 2017 17:33
my reason for doing this is to figure out what's in it; I'm not completely there when it comes to NF yet
Paolo Di Tommaso
@pditommaso
Apr 06 2017 17:35
process foo {
  input: file x from x_ch.view()
  output: file z into z_ch 
  """
  command
  """
}

z_ch.view()
Karin Lagesen
@karinlag
Apr 06 2017 17:35
thanks!
Paolo Di Tommaso
@pditommaso
Apr 06 2017 17:35
:+1:
Karin Lagesen
@karinlag
Apr 06 2017 18:19
slight conundrum
// Summarize MLST results
process run_ariba_mlst_summarize {
    publishDir params.out_dir + "/" + params.mlst_results, mode: 'copy'

    input:
    file pair_id_mlst from pair_id_mlst.collect()

    output:
    file "mlst_summarized_results.tsv" into mlst_summarized

    """
    echo ariba summary mlst_summarized_results.tsv ${pair_id_mlst}/report.tsv

    """
}
this is almost what I need, but not quite there yet
pair_id_mlst is a channel full of directories
my command in the script section takes one to many dirname/report.tsv arguments after the output filename (i.e. the mlst_summarized_results.tsv file here)
what I do get as my command is:
ariba summary mlst_summarized_results.tsv Angen-bacDNA2-79-2013-01-4835_S30_L001 Angen-bacDNA2-92-2013-01-5057_S44_L001 Angen-bacDNA2-78-2013-01-4718_S29_L001/report.tsv
so the last one is correct, and the other Angens should have a /report.tsv after them
and since they're all the same filename inside of the directory, I can't stash them in the same directory
Paolo Di Tommaso
@pditommaso
Apr 06 2017 18:25
ok
if you use
file 'some_prefix_*' from pair_id_mlst.collect()
it will create unique dir names with that prefix
would that work ?
Karin Lagesen
@karinlag
Apr 06 2017 18:26
not sure I understand?
Angen-bacDNA2-79-2013-01-4835_S30_L001 and the other directories all contain a file called report.tsv
Paolo Di Tommaso
@pditommaso
Apr 06 2017 18:26
I'm understanding that your problem is that all dirs have the same name, right?
Karin Lagesen
@karinlag
Apr 06 2017 18:27
not the dirs, the files inside them
Paolo Di Tommaso
@pditommaso
Apr 06 2017 18:27
ahh
if so, no magic trick here
either you use some bash snippet to rename them somehow
or you need to rename those files in the upstream process, i.e. where they are created
Karin Lagesen
@karinlag
Apr 06 2017 18:28
I can do that after the cmd... hmmm.... I think I will try that :)
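For illustration, a sketch of how that could look (untested; it just combines the rename idea with the snippets above, and the ariba command line and output names are taken as-is from the chat): if the upstream process copies each report to a uniquely named file, e.g. cp ${pair_id}/report.tsv ${pair_id}_report.tsv, and emits that copy into pair_id_mlst, the summary process can simply collect them all:

process run_ariba_mlst_summarize {
    publishDir params.out_dir + "/" + params.mlst_results, mode: 'copy'

    input:
    // assumes the upstream process emits files named <pair_id>_report.tsv
    file reports from pair_id_mlst.collect()

    output:
    file "mlst_summarized_results.tsv" into mlst_summarized

    """
    ariba summary mlst_summarized_results.tsv ${reports}
    """
}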
Karin Lagesen
@karinlag
Apr 06 2017 18:53
I have a process that runs three times, due to three different data sets
the last one of them to run (no matter which dataset it is) fails with a curious error
process run_ariba_mlst_pred {
    tag {$pair_id}
    publishDir params.out_dir + "/" + params.mlst_results, mode: 'copy'

    input:
    set pair_id, file(reads) from read_pairs_mlst
    file "mlst_db" from mlst_db

    output:
    file "${pair_id}_report.tsv" into pair_id_mlst

    """
    ariba run mlst_db/ref_db ${reads} ${pair_id}
    cp ${pair_id}/report.tsv ${pair_id}/${pair_id}_report.tsv

    """
}
and the error:
Command output:
  self.mlst_profile_file mlst_db/ref_db/pubmlst.profile.txt
  self.report_file_filtered Angen-bacDNA2-78-2013-01-4718_S29_L001/report.tsv
Paolo Di Tommaso
@pditommaso
Apr 06 2017 18:56
um
change into the task work dir and try to debug the task using
bash .command.run
Karin Lagesen
@karinlag
Apr 06 2017 19:01
the thing that I am wondering about is that this happens with the last one that is run, I've seen this with all of the three datasets
and I can't get .command.run to run, because it says that the output directory already exists
Output directory already exists. ARIBA makes the output directory. Cannot continue.
ariba is the sw that I'm running
Paolo Di Tommaso
@pditommaso
Apr 06 2017 19:02
well, delete it before re-running it!
Karin Lagesen
@karinlag
Apr 06 2017 19:03
doh
having a homer moment here I think :)
Paolo Di Tommaso
@pditommaso
Apr 06 2017 19:03
ahah
NF doesn't do magic
Karin Lagesen
@karinlag
Apr 06 2017 19:04
yes it is!
at least it will be magic once I get it to work
I know this will blow the pants off my coworkers when I deploy the sw :)
Paolo Di Tommaso
@pditommaso
Apr 06 2017 19:04
nope, it just writes and launches the task scripts for you
but you can execute each of them manually ;)
Karin Lagesen
@karinlag
Apr 06 2017 19:05
I get the exact same error
Paolo Di Tommaso
@pditommaso
Apr 06 2017 19:05
ahaha, I would see
there must be a reason!
edit the .command.sh and add a set -x on top
thus, you will trace each single line
Karin Lagesen
@karinlag
Apr 06 2017 19:07
so, at the very top? Before bash? or after?
I've usually just done this by adding a -x to the bash command...
Paolo Di Tommaso
@pditommaso
Apr 06 2017 19:07
just after the shebang eg. #!/bin/bash
Karin Lagesen
@karinlag
Apr 06 2017 19:07
ack
emmm so you want #! set -x /bin/bash ?
(or is it just getting late enough for me that I just confuse myself)
Paolo Di Tommaso
@pditommaso
Apr 06 2017 19:09
nope
#!/bin/bash
set -x
<your script>
Karin Lagesen
@karinlag
Apr 06 2017 19:09
done
:)
Paolo Di Tommaso
@pditommaso
Apr 06 2017 19:10
:D
Karin Lagesen
@karinlag
Apr 06 2017 19:11
now this was nice and predictable
error gone
eh, no
just saw it
you want me to ship you the file in some way?
Paolo Di Tommaso
@pditommaso
Apr 06 2017 19:12
I can't help much there
Karin Lagesen
@karinlag
Apr 06 2017 19:13
but what does the self.mlst_profile_file thing mean?
that is not from the sw
I do think that is from NF...?
Paolo Di Tommaso
@pditommaso
Apr 06 2017 19:13
no, it isn't
Karin Lagesen
@karinlag
Apr 06 2017 19:13
ah, ok
in that case, I will go beat the other sw over the head with the error :)
Paolo Di Tommaso
@pditommaso
Apr 06 2017 19:14
if you delete it
Karin Lagesen
@karinlag
Apr 06 2017 19:14
...delete what?
Paolo Di Tommaso
@pditommaso
Apr 06 2017 19:14
then, when you run bash .command.run it's recreated ?
self.mlst_profile_file I mean
Karin Lagesen
@karinlag
Apr 06 2017 19:14
yes, the error is re-created
Paolo Di Tommaso
@pditommaso
Apr 06 2017 19:14
so, it's your command
unless ..
Karin Lagesen
@karinlag
Apr 06 2017 19:15
yes - but it works with the other 3 sets, and the set it fails with is always the last to run with NF...
other 2 sets, that should have been
Paolo Di Tommaso
@pditommaso
Apr 06 2017 19:15
can you share the .command.run content by using http://pastebin.com
Karin Lagesen
@karinlag
Apr 06 2017 19:15
yep!
Paolo Di Tommaso
@pditommaso
Apr 06 2017 19:17
it's not created by NF, ergo it's your command .. :/
Karin Lagesen
@karinlag
Apr 06 2017 19:17
ok thanks for helping me figure that out!
Paolo Di Tommaso
@pditommaso
Apr 06 2017 19:18
welcome
Karin Lagesen
@karinlag
Apr 06 2017 19:18
and on that note, I think it's time for dinner and bed :)
(and again, this channel is a godsend!)
Paolo Di Tommaso
@pditommaso
Apr 06 2017 19:18
indeed !
that's the best way to fix a bug
start with a fresh mind the day after ;)
Karin Lagesen
@karinlag
Apr 06 2017 19:19
yep :)