Brian Carcich
@drbitboy
So we now have 100% coverage in pr217, plus a smaller kernel set for testing in my master-staging branch that is ready to issue as a PR once pr217 is merged. Let me know if you need anything changed to do the merges.
Andrew Annex
@AndrewAnnex
I made comments on the pr that you should review
Andrew Annex
@AndrewAnnex
I can also merge as is, just be aware that I will want to change certain things
Brian Carcich
@drbitboy
where are your comments? I don't see any here: AndrewAnnex/SpiceyPy#218
Brian Carcich
@drbitboy
I'm pretty sure I addressed your comments ca. 2017-10-17T00:08 EDT; the only thing I could not know about would be any docstring style issues I missed
So yes, please go ahead and merge.
Brian Carcich
@drbitboy
I also got rid of MGS/Mars kernels in testing to save another 20MB+ in the downloads.
Brian Carcich
@drbitboy
Let me know when you have it how you want it so I can merge those changes back.
Andrew Annex
@AndrewAnnex
they should be up now. I haven't really started making those changes yet; what I plan to do is merge the current state of the PR and then add a commit that makes the changes I need, so that they are recorded separately.
Brian Carcich
@drbitboy
Okay, I pushed those changes ('\0' four times, plus the (1+end-begin) fix in dafgsr) to my pr217/master-with-fixes branch, so it is now part of pr218. Travis passed all Linux tests, failed on osx/Py2.7.13, is still working on osx/Py3.x, passed the first two AppVeyor tests, and coverage is still at 100%.
I tweaked the docstring in dafgsr, and found another typo (happend should be happened).
Brian Carcich
@drbitboy
If you click the [Restart job] button on travis-ci 1131.5, it may clear the [Some checks were not successful] flag.
I could also do a trivial edit on that last requested change to clear the other red [x].
Andrew Annex
@AndrewAnnex
@drbitboy so I merged in your changes using a squash merge to combine all the commits into one. It looks like you may have switched computers or changed your git authorship settings, as GitHub is currently not linked with your kinetx.com email account. I can fix this by changing the commit and force pushing to the repo, or you can just add that address to your account info at https://github.com/settings/emails
If I had seen that earlier I would have made sure to use the gmail address instead
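For anyone following along, the "change the commit" option maps onto standard git commands; a minimal sketch, with you@kinetx.example standing in as a placeholder address and master as an assumed target branch:
$ git commit --amend --author="Brian Carcich <you@kinetx.example>" --no-edit
$ git push --force origin master
The other option needs no git at all: adding the address at https://github.com/settings/emails is enough for GitHub to link the existing commit to the profile.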
Brian Carcich
@drbitboy
Does it matter which email account is associated with github? My main accounts are drbitboy@gmail.com and briantcarcich@gmail.com. I am wondering how you associated me with my kinetx account?
anyway, thanks, that is great.
okay, I added the kinetx email to my github account.
Andrew Annex
@AndrewAnnex
I was squashing the commits and picked the first author line I saw; I only noticed the difference in email accounts when I looked into why GitHub wasn't crediting you. You can view the authorship for a commit by looking at the output of git log.
anyways, I see that you are now credited.
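For example, a couple of stock git invocations that show authorship (nothing project-specific assumed here):
$ git log -1 --pretty=full
$ git log --format='%h %an <%ae>' -n 5
The first prints both the Author: and Commit: lines for the latest commit; the second prints the short hash, author name, and email for the last five commits.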
Brian Carcich
@drbitboy
Great, thanks for merging the contributing edits.
I also issued a PR for the smaller kernel set. That will obviously take a lot more time to review; feel free to ping me back with any queries about what I put in there.
Also, thanks for making the contributing guide; the PRs should take less work now (e.g. I originally had not thought to add the license on the new DE-405S script).
Andrew Annex
@AndrewAnnex
@drbitboy take a look at my most recent commits; there were a number of issues with the newly wrapped ek functions that ended up needing to be fixed. Although contributions are great, there were a few too many changes at once, which limited my ability to do a proper review, and I can't keep up with the number of PRs I have received either (and I am in grad school).
Brian Carcich
@drbitboy
Hi Andrew, I got into conflict hell with the whole small SPK/CK thing, so if you don't mind I am going to close the PR.
Brian Carcich
@drbitboy
Okay, back in business, down to 1.2MB total bandwidth for kernels.
I understand that you won't be merging these any time soon; that is fine. Let me know when would be a good time to issue new PRs.
Brian Carcich
@drbitboy
Nice work on spk14.
I was just thinking: the stress tests do exactly the same thing over and over.
In fact, the underlying system (Python, garbage collector, O/S, etc.) probably does exactly the same thing too, i.e. after the first few passes it probably allocates exactly the same memory for each variable every time.
A way to get around that would be to change something in the underlying system each time, e.g.
guard = []
for iPass in range(500):
    # grow an unrelated list each pass so the allocator cannot keep
    # handing back the exact same addresses to the test
    guard = guard + list(range(555))
    test_spk14a()
Brian Carcich
@drbitboy
Yup, 33 calls to spice.spk14a, only five unique memory address pairs for coeffs and epochs:
$ git diff
diff --git a/spiceypy/spiceypy.py b/spiceypy/spiceypy.py
index 6de592e..5c5d6cc 100644
--- a/spiceypy/spiceypy.py
+++ b/spiceypy/spiceypy.py
@@ -11119,6 +11119,8 @@ def spk14a(handle, ncsets, coeffs, epochs):
     ncsets = ctypes.c_int(ncsets)
     coeffs = stypes.toDoubleVector(coeffs)
     epochs = stypes.toDoubleVector(epochs)
+    with open('spk14a_pointers.log','ab') as fOut:
+        fOut.write('{},{}\n'.format(coeffs.__repr__(),epochs.__repr__()))
     libspice.spk14a_c(handle, ncsets, coeffs, epochs)



$ python setup.py test
...


$ awk '!($0 in d){d[$0]=FNR}{print $0,FNR,d[$0]}' spk14a_pointers.log 
<spiceypy.utils.support_types.c_double_Array_80 object at 0x7f9c929bb290>,<spiceypy.utils.libspicehelper.c_double_Array_4 object at 0x7f9c929bb8c0> 1 1
<spiceypy.utils.support_types.c_double_Array_80 object at 0x7f9c929bb4d0>,<spiceypy.utils.libspicehelper.c_double_Array_4 object at 0x7f9c929bb290> 2 2
<spiceypy.utils.support_types.c_double_Array_80 object at 0x7f9c929bb950>,<spiceypy.utils.libspicehelper.c_double_Array_4 object at 0x7f9c929bb4d0> 3 3
<spiceypy.utils.support_types.c_double_Array_80 object at 0x7f9c929bb560>,<spiceypy.utils.libspicehelper.c_double_Array_4 object at 0x7f9c929bb950> 4 4
<spiceypy.utils.support_types.c_double_Array_80 object at 0x7f9c929bb8c0>,<spiceypy.utils.libspicehelper.c_double_Array_4 object at 0x7f9c929bb560> 5 5
<spiceypy.utils.support_types.c_double_Array_80 object at 0x7f9c929bb290>,<spiceypy.utils.libspicehelper.c_double_Array_4 object at 0x7f9c929bb8c0> 6 1
<spiceypy.utils.support_types.c_double_Array_80 object at 0x7f9c929bb4d0>,<spiceypy.utils.libspicehelper.c_double_Array_4 object at 0x7f9c929bb290> 7 2
<spiceypy.utils.support_types.c_double_Array_80 object at 0x7f9c929bb950>,<spiceypy.utils.libspicehelper.c_double_Array_4 object at 0x7f9c929bb4d0> 8 3
<spiceypy.utils.support_types.c_double_Array_80 object at 0x7f9c929bb560>,<spiceypy.utils.libspicehelper.c_double_Array_4 object at 0x7f9c929bb950> 9 4
<spiceypy.utils.support_types.c_double_Array_80 object at 0x7f9c929bb8c0>,<spiceypy.utils.libspicehelper.c_double_Array_4 object at 0x7f9c929bb560> 10 5
<spiceypy.utils.support_types.c_double_Array_80 object at 0x7f9c929bb290>,<spiceypy.utils.libspicehelper.c_double_Array_4 object at 0x7f9c929bb8c0> 11 1
<spiceypy.utils.support_types.c_double_Array_80 object at 0x7f9c929bb4d0>,<spiceypy.utils.libspicehelper.c_double_Array_4 object at 0x7f9c929bb290> 12 2
<spiceypy.utils.support_types.c_double_Array_80 object at 0x7f9c929bb950>,<spiceypy.utils.libspicehelper.c_double_Array_4 object at 0x7f9c929bb4d0> 13 3
<spiceypy.utils.support_types.c_double_Array_80 object at 0x7f9c929bb560>,<spiceypy.utils.libspicehelper.c_double_Array_4 object at 0x7f9c929bb950> 14 4
...
<spiceypy.utils.support_types.c_double_Array_80 object at 0x7f9c929bb560>,<spiceypy.utils.libspicehelper.c_double_Array_4 object at 0x7f9c929bb950> 29 4
<spiceypy.utils.support_types.c_double_Array_80 object at 0x7f9c929bb8c0>,<spiceypy.utils.libspicehelper.c_double_Array_4 object at 0x7f9c929bb560> 30 5
<spiceypy.utils.support_types.c_double_Array_80 object at 0x7f9c929bb290>,<spiceypy.utils.libspicehelper.c_double_Array_4 object at 0x7f9c929bb8c0> 31 1
<spiceypy.utils.support_types.c_double_Array_80 object at 0x7f9c929bb4d0>,<spiceypy.utils.libspicehelper.c_double_Array_4 object at 0x7f9c929bb290> 32 2
<spiceypy.utils.support_types.c_double_Array_80 object at 0x7f9c929bb950>,<spiceypy.utils.libspicehelper.c_double_Array_4 object at 0x7f9c929bb4d0> 33 3
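For reference, the awk one-liner just tags each line of the log with its own line number and the line number of its first occurrence; a minimal Python sketch of the same check, assuming spk14a_pointers.log was produced by the patched spk14a above:
seen = {}
with open('spk14a_pointers.log') as fin:
    for lineno, line in enumerate(fin, start=1):
        key = line.strip()
        first = seen.setdefault(key, lineno)  # line number of first occurrence
        print(key, lineno, first)
print('unique address pairs:', len(seen))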
Andrew Annex
@AndrewAnnex
@drbitboy there is a PR for disabling the found flag capture that you might be interested in
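For context, disabling the found flag capture means the wrapper hands the found flag back to the caller instead of raising an exception; a minimal sketch, assuming the feature landed as the found_check toggles present in later SpiceyPy releases (the body ID is just a made-up example):
import spiceypy as spice

# by default SpiceyPy raises NotFoundError when a CSPICE routine reports found == False;
# inside no_found_check() the flag is handed back to the caller instead
with spice.no_found_check():
    name, found = spice.bodc2n(-999999)
    if not found:
        print('no body name associated with that ID')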
Andrew Annex
@AndrewAnnex
I am going to make the next release soon; if there are bugs that still need to be fixed or documentation typos that anyone has noticed, now is a great time to contribute fixes.
Brian Carcich
@drbitboy
that's great.
Andrew Annex
@AndrewAnnex
in case you haven't seen: spiceypy 2.1.0 has been released!
Andrew Annex
@AndrewAnnex
spiceypy 2.1.1 is out!
kidpixo
@kidpixo
nice 👍 I must start using it at some point...
Chris Smola
@Smolations
wow...2 posts for all of last year??
now i'm worried no one is listening... =X
@AndrewAnnex are you around at least? (hail mary here...haha)
Andrew Annex
@AndrewAnnex
@Smolations I'm listening; I tend to get most questions over email or on slack (openplanetary's slack channel). What is up?
Chris Smola
@Smolations
ahh, actually, i prefer the slack channel, but i tried to sign up and it told me i needed an invite. maybe i went about it the wrong way?
Chris Smola
@Smolations
scratch that...looks like i missed the page which includes the form to "apply" to the community. i shall do that now!
Andrew Annex
@AndrewAnnex
@Smolations well you can ask me your question here or there. joining openplanetary should be fairly quick so I was wondering why I haven't heard anything from you yet...
Chris Smola
@Smolations
i feel a bit more comfortable in slack, so i'll head over there (i was just given the invite today). now that i have access to that workspace, and given that my question(s) are related more to general orbital mechanics and not issues with SpiceyPy itself, i don't want to waste your time...unless you enjoy that type of problem solving in general.. =]
Jorge Martinez
@jorgepiloto
Hi everyone, just joined the chat :smile:
I am planning to use SpiceyPy for a University project and for sure it will be really useful :thumbsup:
Andrew Annex
@AndrewAnnex
hi Jorge, sorry I didn't see this earlier; I tend not to use gitter too much, but I am around on twitter / the openplanetary slack.
Also BTW version 2.2.0 was released a few days ago! The conda-forge distribution still needs to be updated though, so it may take a few days.