These are chat archives for bluescarni/piranha

21st Nov 2016
Isuru Fernando
@isuruf
Nov 21 2016 10:02
What function in pyranha uses numpy?
just a 3x3 matrix really
seemed wrong to return a list of lists
maybe I should just disable the doctest for that function?
so we don't need numpy for the test
I just thought that most people doing scientific computing on python would have numpy installed anyway
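For reference, disabling a single doctest is straightforward with the `SKIP` option flag; here is a minimal sketch, where `identity3` is a hypothetical stand-in for the pyranha function in question:

```python
import doctest


def identity3():
    """Return a 3x3 identity matrix as a NumPy array.

    The expected output is flagged with SKIP, so running the
    doctests does not require NumPy at all:

    >>> identity3()  # doctest: +SKIP
    array([[1., 0., 0.],
           [0., 1., 0.],
           [0., 0., 1.]])
    """
    import numpy as np
    return np.eye(3)


# The skipped example is never executed, so this reports no failures
# even on a machine without NumPy installed.
results = doctest.testmod()
print(results.failed)  # → 0
```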
Isuru Fernando
@isuruf
Nov 21 2016 10:08
No, that's totally fine. Both numpy and mpmath are test requirements, and they'll be used for tests after building the package.
Francesco Biscani
@bluescarni
Nov 21 2016 10:08
ok, cheers!
what about pep-3149?
Isuru Fernando
@isuruf
Nov 21 2016 10:09
the compiled library is named _core.so, right?
Francesco Biscani
@bluescarni
Nov 21 2016 10:09
yes that's correct
you mean I should version it?
version + abi tag I mean
Isuru Fernando
@isuruf
Nov 21 2016 10:11
yes, to support multiple python versions on some OSes. not needed for conda
this is minor though
Francesco Biscani
@bluescarni
Nov 21 2016 10:11
ok I'll read about it. distutils does it automatically for the wheel name: https://pypi.python.org/pypi/pyranha
I hadn't thought about the versioning of the internal compiled module
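The suffix that PEP 3149 prescribes can be queried from the interpreter itself; a small sketch (the exact tag printed depends on the interpreter version, build options and platform):

```python
import sysconfig

# Under PEP 3149, extension modules carry a version/ABI tag in their
# filename, so builds for different interpreters can coexist in the
# same directory, e.g. "_core.cpython-35m-x86_64-linux-gnu.so"
# instead of a bare "_core.so".
suffix = sysconfig.get_config_var("EXT_SUFFIX")
print(suffix)
```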
Francesco Biscani
@bluescarni
Nov 21 2016 10:12
ahh nice!
so you are not linking to the Python library, if I read that correctly?
Isuru Fernando
@isuruf
Nov 21 2016 10:14
not on linux
Francesco Biscani
@bluescarni
Nov 21 2016 10:15
ok... I was reading a bit about it; I am not sure pyranha needs to do it, because Boost.Python transitively brings in a dependency on the Python shared library
Isuru Fernando
@isuruf
Nov 21 2016 10:17
pep-3149 is needed when you install the library for multiple python 3 versions side by side. Not linking on linux is a separate issue
Francesco Biscani
@bluescarni
Nov 21 2016 10:18
right I see
I think the linking stuff might matter when building manylinux1 wheels, but we'll see
Isuru Fernando
@isuruf
Nov 21 2016 10:19
multiple python versions would have the same name _core.so, so one would override the other. .py files don't matter, because they are the same for all versions
Francesco Biscani
@bluescarni
Nov 21 2016 10:21
by multiple python3 versions, you mean same python3 version compiled in multiple ways (e.g., debug vs non debug, pymalloc, etc.) or different python3 versions (e.g., 3.4 vs 3.5)?
Isuru Fernando
@isuruf
Nov 21 2016 10:21
3.4 and 3.5
or debug, pymalloc
Francesco Biscani
@bluescarni
Nov 21 2016 10:21
but wouldn't the package be in a separate site-packages directory for different python versions?
unless I am misunderstanding and this is about packaging pyranha with multiple _core.so files in a single package
Isuru Fernando
@isuruf
Nov 21 2016 10:25
Yes, site-packages has different directories for different python versions
dist-packages in Debian doesn't, though
Francesco Biscani
@bluescarni
Nov 21 2016 10:25
ah I see
I'll implement this, thanks for pointing it out
Francesco Biscani
@bluescarni
Nov 21 2016 10:30
awesome, thanks for all the work!
what is the process for when new piranha/pyranha versions are released?
Isuru Fernando
@isuruf
Nov 21 2016 10:31
You send a new PR to the above repo and one of the other maintainers merges it
Then travis-ci and circle-ci would build it and publish it automatically
Francesco Biscani
@bluescarni
Nov 21 2016 10:32
ok, so no need to go through staged-recipes?
Isuru Fernando
@isuruf
Nov 21 2016 10:32
nope
Francesco Biscani
@bluescarni
Nov 21 2016 10:32
ok awesome
Isuru Fernando
@isuruf
Nov 21 2016 10:37
conda-forge/pyranha-feedstock#1
Francesco Biscani
@bluescarni
Nov 21 2016 10:38
awesome, thanks!
I was looking at the recipe and wondering whether we should keep piranha as a dependency?
Isuru Fernando
@isuruf
Nov 21 2016 10:41
Is there a way to stop installing the headers when building pyranha?
Francesco Biscani
@bluescarni
Nov 21 2016 10:42
not at the moment
I can add it, it's not difficult
Isuru Fernando
@isuruf
Nov 21 2016 10:42
then we should keep it as a build dependency at least
when we add it as a build dependency, conda won't notice that pyranha also installed headers
Francesco Biscani
@bluescarni
Nov 21 2016 10:43
as a workaround then you mean?
Isuru Fernando
@isuruf
Nov 21 2016 10:43
yes
Francesco Biscani
@bluescarni
Nov 21 2016 10:43
Ok. For the next version I'll fix this
I was also thinking that we should probably enable the bzip2/zlib support in the pyranha build
Isuru Fernando
@isuruf
Nov 21 2016 10:44
they are
Francesco Biscani
@bluescarni
Nov 21 2016 10:44
Isuru Fernando
@isuruf
Nov 21 2016 10:45
ah, that's a mistake. I'll add them
Francesco Biscani
@bluescarni
Nov 21 2016 10:45
thanks!
Isuru Fernando
@isuruf
Nov 21 2016 10:45
Would you mind adding the comments in the PR, so that I don't forget it when I'm updating the PR?
Francesco Biscani
@bluescarni
Nov 21 2016 10:45
certainly
Isuru Fernando
@isuruf
Nov 21 2016 10:55
we can also add msgpack-c, but there is no conda package for it yet
Francesco Biscani
@bluescarni
Nov 21 2016 10:56
Yes I'll look into providing a package for it
Francesco Biscani
@bluescarni
Nov 21 2016 11:29
fantastic :)
thanks a lot for all the effort
Isuru Fernando
@isuruf
Nov 21 2016 11:31
np
Francesco Biscani
@bluescarni
Nov 21 2016 11:32
this is really important for the project, ease of use/distribution is critical
Francesco Biscani
@bluescarni
Nov 21 2016 12:11
@isuruf how do you like Circle CI?
Isuru Fernando
@isuruf
Nov 21 2016 13:28
It's fast, never had any queues, but only one job per build is a dealbreaker
Francesco Biscani
@bluescarni
Nov 21 2016 13:29
right... it's similar to appveyor in that sense, isn't it?
Isuru Fernando
@isuruf
Nov 21 2016 13:29
yes
It's worse, because I like multiple jobs per build even if the jobs are run serially
Francesco Biscani
@bluescarni
Nov 21 2016 13:30
ah right yes... here everything happens in the same context
do you think there is any way of re-using the pip wheels on pypi as a conda package?
my understanding is that installing packages via pip inside a conda environment is supported, so I imagine conda's python should be binary compatible with python.org's python, which would make the binary wheels compatible as well
Isuru Fernando
@isuruf
Nov 21 2016 13:33
yes. you can do pip install inside the build script
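A minimal sketch of what that build script could look like (hypothetical; it assumes the PyPI wheel is binary compatible with conda's CPython build):

```shell
#!/bin/bash
# Hypothetical build.sh for a conda recipe that reuses the prebuilt
# wheel from PyPI instead of compiling from source. --no-deps keeps
# pip from pulling in dependencies that conda should manage itself.
pip install --no-deps --no-cache-dir pyranha
```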
Francesco Biscani
@bluescarni
Nov 21 2016 13:33
oh right
hadn't thought about that :)
Isuru Fernando
@isuruf
Nov 21 2016 13:36
conda-forge/pyranha-feedstock#1 failed because it took more than 2 hours to build all 3
Francesco Biscani
@bluescarni
Nov 21 2016 13:36
ah damn :(
it had almost finished
Isuru Fernando
@isuruf
Nov 21 2016 13:37
I'll try make -j2. I tried make -j4, but the compiler was killed because it used too much memory
Francesco Biscani
@bluescarni
Nov 21 2016 13:38
right let's see if that works
I need to find ways to trim down the compile time
it all depends on how many series types are exposed to python. maybe the current selection is too wide
it exposes polynomials, Poisson series and divisor series with a variety of coefficient types and monomial representations
Isuru Fernando
@isuruf
Nov 21 2016 13:44
travis-ci is at the edge of timeout
44 / 50 minutes
Francesco Biscani
@bluescarni
Nov 21 2016 13:44
yes sometimes I need to re-start the build manually
I will go ahead and remove the series with MPFR coefficients for the next release
this should shave off about 20% from the build time
Isuru Fernando
@isuruf
Nov 21 2016 13:52
For make -j4 there might be an ordering of the .cpp files that stays under the memory limit
Francesco Biscani
@bluescarni
Nov 21 2016 13:54
maybe, it's a good idea to explore
probably we'd need to group one Poisson series, one divisor series and one/two polynomials
something like that
the Poisson series are the heaviest
it will get progressively better; the plan is also to remove a few things around poly division/GCD, which would further reduce the compile time
Isuru Fernando
@isuruf
Nov 21 2016 13:56
See this log, https://circleci.com/gh/isuruf/staged-recipes/93. This one used make -j4
Francesco Biscani
@bluescarni
Nov 21 2016 13:57
I think j4 is out of reach at the moment, already having j2 would be really good
clang would help as well
Isuru Fernando
@isuruf
Nov 21 2016 14:00
yes, although I don't know what clang version to use to make it compatible
Francesco Biscani
@bluescarni
Nov 21 2016 14:05
not sure either... in my experience clang has decent compatibility going back to GCC 4.8. On travis clang 3.8 is used and that seems to work happily in conjunction with GCC 4.8's libstdc++ (I believe)
it's also true that in recent versions clang has become slower and fatter, so not sure it's really worth it to go down this road
Francesco Biscani
@bluescarni
Nov 21 2016 15:28
seems like j2 worked!