These are chat archives for dereneaton/ipyrad

8th Jun 2017
LinaValencia85
@LinaValencia85
Jun 08 2017 01:34
@isaacovercast a little bit late, but I ran the reference pipeline and it worked well. Thanks!
LinaValencia85
@LinaValencia85
Jun 08 2017 01:42
@dereneaton I am analyzing some 150 bp PE ddRAD data from one primate species for population genetic analyses. After running steps 1-7 I realized that many of my loci show a strange pattern, where ipyrad identifies a SNP at almost every site in the last ~40 bp of the locus. After looking at the data, I realized this is caused by incomplete restriction-enzyme digestion in some samples. Because I used a very common-cutting enzyme, not all reads are cut at the same site, and for some loci I see two very closely spaced RE sites, which leads to a messy alignment at the end of the locus and thus many false SNPs. I was wondering whether ipyrad has any way of adaptively trimming loci, so that the last bases are removed not just by a fixed default value (using trim_loci) but, as in pyRAD, by trimming each locus to the shortest read? Thanks!!
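A minimal sketch of the fixed-trim route mentioned above, via the ipyrad Python API; it does not do the adaptive per-locus trimming being asked about. It assumes an assembly saved from a previous run and a version in which the locus-edge trimming parameter is named trim_loci; the JSON path and trim values are placeholders, not a recommendation.

    import ipyrad as ip

    # hypothetical path to the assembly JSON saved by the earlier run
    data = ip.load_json("primate_ddrad.json")

    # illustrative values only: trim a fixed 40 bp off the 3' end of R2 in every locus
    data.set_params("trim_loci", (0, 0, 0, 40))

    # trim_loci is applied when output files are written, so only step 7 needs re-running
    data.run("7", force=True)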
dinmatias
@dinmatias
Jun 08 2017 02:41
@dereneaton Thanks for that notice. I've noticed that 20 samples are processed at a time, so when those finish it will be about 20% of the 100 samples I'm running. So if I increase the number of threads, will that speed up the processing of each sample but decrease the number of samples being run in parallel?
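For what it's worth, a minimal sketch of that trade-off, assuming the CLI flags -c (total cores) and -t (threads per clustering job): with 20 cores and 4 threads each, roughly 5 samples are clustered concurrently, each finishing faster, instead of 20 single-threaded jobs. The params file name is a placeholder.

    import subprocess

    # hypothetical invocation: run step 3 with 20 cores total and 4 threads per
    # clustering job, so about 5 samples are clustered at once instead of 20
    subprocess.run(
        ["ipyrad", "-p", "params-myassembly.txt", "-s", "3", "-c", "20", "-t", "4"],
        check=True,
    )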
Nic Diaz
@NebulousNic_twitter
Jun 08 2017 18:05
@dereneaton Hi, I'm collaborating with @jaecan808_twitter and trying to help troubleshoot the error we're getting. We're running ipyrad v.0.6.27. We set the paths to our barcode file and our raw data (which has a .fastq.gz file extension), but when running step 1 we get the error "Encountered an unexpected error: not a gzipped file". We haven't altered the data file in any way since downloading it, so we'd really appreciate any insights you might have. Thanks! Nic
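One quick check worth sketching here: whether the file is actually gzip-compressed. Gzip files begin with the magic bytes 0x1f 0x8b, so a plain-text FASTQ that merely carries a .fastq.gz name, or a file mangled during transfer, will trigger exactly this kind of "not a gzipped file" error. The path below is a placeholder.

    import gzip

    path = "rawdata_R1_.fastq.gz"   # hypothetical path, as set in the params file

    with open(path, "rb") as handle:
        is_gzipped = handle.read(2) == b"\x1f\x8b"   # gzip magic bytes

    if is_gzipped:
        with gzip.open(path, "rt") as handle:
            print(handle.readline().strip())         # first FASTQ header, to confirm it decompresses
    else:
        print("Not actually gzip-compressed; re-download or recompress the file with gzip.")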