These are chat archives for dereneaton/ipyrad

17th
Sep 2018
Mariana Vasconcellos
@marypsiboas_twitter
Sep 17 2018 18:01
Hello everyone,
I ran the entire ipyrad pipeline successfully with a clustering threshold of 0.85, but afterwards I decided to branch my assembly to try a different clustering threshold of 0.90 and forced ipyrad to run steps 3-7 again. To my surprise, the last part of the stats file (## Final Sample stats summary) is almost identical in both assemblies; the only difference is the last column, "loci_in_assembly". But the output files themselves are not identical, only that part of the stats. I wonder whether those stats columns are being carried over from the old assembly to my new branched assembly, with only the stats from the final step 7 being updated. I am happy to share my .json file if necessary. I believe the branching itself worked fine, given that it created the new folders for my new assembly and those are not identical to my previous assembly. I believe this might be a bug in how the stats file is written for a branched assembly once all the steps have already been finished. I am happy to provide more details of my case to help the developers recreate what happened to my assembly. Any help to get this figured out will be highly appreciated. Thanks.
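For reference, the branch-and-rerun workflow described above looks roughly like this in the ipyrad Python API (a sketch only; the JSON path and branch name are placeholders, not the actual files from this run):

import ipyrad as ip

# load the finished assembly (clust_threshold 0.85) from its saved JSON
data = ip.load_json("./analysis/my_assembly.json")

# branch it under a new name; the branch starts as a copy of the parent assembly
new = data.branch("clust90")

# change only the clustering threshold on the new branch
new.set_params("clust_threshold", 0.90)

# force steps 3-7 to run again for the branch
new.run("34567", force=True)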
Jeronymo Dalapicolla
@jdalapicolla
Sep 17 2018 21:12

Hello, I was running a job with version 0.7.17 in parallel, nodes=6:ppn=8, pmem=4gb (96GB), and the job crashed with the error below.

1) I think it means I am out of memory, right? If I run with a higher ppn, like 12 or 16, will it help? I have 223 samples.

2) And if I need to run again in this version, can I run only the last two substeps of step 6, "indexing clusters" and "building database"? I didn't find an argument for substeps in the help section.
Thank you for your time,
Error:

Step 6: Clustering at 0.9 similarity across 223 samples
[####################] 100% concat/shuffle input | 0:03:33
[####################] 100% clustering across | 7:44:49
[####################] 100% building clusters | 0:02:28
[####################] 100% aligning clusters | 3:06:24
[####################] 100% database indels | 0:17:34
[ ] 0% indexing clusters | 0:05:45 ERROR:ipyrad.core.assembly:EngineError(Engine '837c35da-daaff83dbb0692f8867f5336' died while running task u'33aed23f-f52950b51c2c72c5d1081df1')
ERROR:tornado.general:Uncaught exception, closing connection.
Traceback (most recent call last):
File "/sw/lsa/centos7/ipyrad/0.7.17/lib/python2.7/site-packages/zmq/eventloop/zmqstream.py", line 414, in _run_callback
callback(*args, **kwargs)
File "/sw/lsa/centos7/ipyrad/0.7.17/lib/python2.7/site-packages/tornado/stack_context.py", line 277, in null_wrapper
return fn(*args, **kwargs)
File "<decorator-gen-141>", line 2, in _dispatch_reply
File "/sw/lsa/centos7/ipyrad/0.7.17/lib/python2.7/site-packages/ipyparallel/client/client.py", line 69, in unpack_message
return f(self, msg)
File "/sw/lsa/centos7/ipyrad/0.7.17/lib/python2.7/site-packages/ipyparallel/client/client.py", line 888, in _dispatch_reply
handler(msg)
File "/sw/lsa/centos7/ipyrad/0.7.17/lib/python2.7/site-packages/ipyparallel/client/client.py", line 817, in _handle_apply_reply
self.results[msg_id] = self._unwrap_exception(content)
File "/sw/lsa/centos7/ipyrad/0.7.17/lib/python2.7/site-packages/ipyparallel/client/client.py", line 658, in _unwrap_exception
eid = self._engines[e_uuid]
File "/sw/lsa/centos7/ipyrad/0.7.17/lib/python2.7/site-packages/ipyparallel/util.py", line 79, in getitem
return self._reverse[key]
KeyError: u'635633df-1c866bf27c1850ab3ac08145'
ERROR:tornado.general:Uncaught exception, closing connection.
Traceback (most recent call last):
File "/sw/lsa/centos7/ipyrad/0.7.17/lib/python2.7/site-packages/zmq/eventloop/zmqstream.py", line 440, in _handle_events
self._handle_recv()
File "/sw/lsa/centos7/ipyrad/0.7.17/lib/python2.7/site-packages/zmq/eventloop/zmqstream.py", line 472, in _handle_recv
self._run_callback(callback, msg)
File "/sw/lsa/centos7/ipyrad/0.7.17/lib/python2.7/site-packages/zmq/eventloop/zmqstream.py", line 414, in _run_callback
callback(*args, **kwargs)
File "/sw/lsa/centos7/ipyrad/0.7.17/lib/python2.7/site-packages/tornado/stack_context.py", line 277, in null_wrapper
return fn(*args, **kwargs)
File "<decorator-gen-141>", line 2, in _dispatch_reply
File "/sw/lsa/centos7/ipyrad/0.7.17/lib/python2.7/site-packages/ipyparallel/client/client.py", line 69, in unpack_message
return f(self, msg)
File "/sw/lsa/centos7/ipyrad/0.7.17/lib/python2.7/site-packages/ipyparallel/client/client.py", line 888, in _dispatch_reply
handler(msg)
File "/sw/lsa/centos7/ipyrad/0.7.17/lib/python2.7/site-packages/ipyparallel/client/client.py", line 817, in _handle_apply_reply
self.results[msg_id] = self._unwrap_exception(content)
File "/sw/lsa/centos7/ipyrad/0.7.17/lib/python2.7/site-packages/ipyparallel/client/client.py", line 658, in _unwrap_exception
eid = self._engines[e_uuid]
File "/sw/lsa/centos7/ipyrad/0.7.17/lib/python2.7/site-packages/ipyparallel/util.py", line 79, in getitem
return self._reverse[key]
[traceback continues...] There isn't enough space to paste the rest.

Isaac Overcast
@isaacovercast
Sep 17 2018 23:23
@jdalapicolla Yes, this is typically a RAM issue. More ppn might actually help, and more pmem too, as much as you can get (see the sketch below).
@marypsiboas_twitter Ayyyyyyyyyyyyyyyyyy, yes, this is a known bug that I've been meaning to fix. When you branch and re-run, the stats actually aren't getting overwritten properly, which is super annoying! The assembly does run, though, and the stats file in the outfiles dir is correct.
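Building on the advice above about more ppn and pmem, pointing the step 6-7 re-run at a bigger allocation looks roughly like this in the Python API (a sketch only; the JSON path is a placeholder, and it assumes an ipcluster has already been started across the cores the job requested):

import ipyrad as ip
import ipyparallel as ipp

# connect to the running ipcluster and check how many engines came up
ipyclient = ipp.Client()
print("{} engines connected".format(len(ipyclient.ids)))

# reload the assembly that crashed during step 6 and re-run steps 6-7 on those engines
data = ip.load_json("./analysis/my_assembly.json")
data.run("67", ipyclient=ipyclient)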
Jeronymo Dalapicolla
@jdalapicolla
Sep 17 2018 23:44
@isaacovercast Thank you so much!!! Can I run only the last two substeps ("indexing clusters" and "building database"), or should I rerun all of step 6? I didn't find an argument for substeps in the help section.
Isaac Overcast
@isaacovercast
Sep 17 2018 23:52
Step 6 substeps should be automatically tracked, so you shouldn't have to do anything except run step 6 again.
@N-atalia Have you checked to see whether this file is there: /Users/apple/Desktop/BPP/BPP_10reps_EU1/analysis-bpp/6sp_10reps_EU_r0.out.txt
What is the content of this file?
Jeronymo Dalapicolla
@jdalapicolla
Sep 17 2018 23:55
@isaacovercast Great! Thanks!