These are chat archives for bluescarni/pagmo_reborn

11th
Jun 2016
Dario Izzo
@darioizzo
Jun 11 2016 07:56
I agree (in general).
Dario Izzo
@darioizzo
Jun 11 2016 08:23
The line I have trouble testing is:
pagmo_throw(std::invalid_argument,"The size of the (dense) gradient "
                        "sparsity is too large");
which triggers if:
if (nx > std::numeric_limits<vector_double::size_type>::max() / nf)
now nx is the dimension of the chromosome, which is deduced from:
const auto nx = get_nx();
which is basically the size of the vector_double returned by get_bounds.
So, two problems there:
1) If we must return a vector_double of large size from get_bounds(), say std::numeric_limits<vector_double::size_type>::max() / 2, that will throw, since there is simply not enough memory on any modern computer I have used in the last two days
Dario Izzo
@darioizzo
Jun 11 2016 08:28
2) Even if we manage to return such big vectors (maybe changing the requested upper limit to 1/100u or something), initializing them takes ages (like 5-20 seconds), which makes the test super long
Conclusion: I suggest keeping the lines and skipping the test. Or we somehow introduce a hard limit on the sizes of nobj, nx, nf, nc, nec, nic, etc. (which would require a redesign, and which I personally do not like as a solution); a sketch of the check in question is included below.
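For reference, a minimal, self-contained sketch of the kind of overflow guard being discussed. The function name check_dense_gradient_size is hypothetical (it is not part of pagmo); nx and nf mirror the snippet quoted above.

#include <iostream>
#include <limits>
#include <stdexcept>
#include <vector>

using vector_double = std::vector<double>;

// A dense gradient sparsity pattern has nx * nf entries. If
// nx > max / nf, then nx * nf would overflow vector_double::size_type,
// so the guard throws before the multiplication is ever performed.
void check_dense_gradient_size(vector_double::size_type nx, vector_double::size_type nf)
{
    if (nf != 0u && nx > std::numeric_limits<vector_double::size_type>::max() / nf) {
        throw std::invalid_argument("The size of the (dense) gradient sparsity is too large");
    }
}

int main()
{
    check_dense_gradient_size(10u, 3u); // fine: 30 entries.
    try {
        // Triggers only for an astronomically large nx.
        check_dense_gradient_size(std::numeric_limits<vector_double::size_type>::max() / 2u, 3u);
    } catch (const std::invalid_argument &e) {
        std::cout << e.what() << '\n';
    }
}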
Francesco Biscani
@bluescarni
Jun 11 2016 08:40
no, it's ok, let's keep it as it is
this is one of those unrealistic scenarios for full coverage
Francesco Biscani
@bluescarni
Jun 11 2016 08:59
btw, on a 64-bit architecture we are talking about on the order of 2^64 elements to trigger the error
roughly 16777216 terabytes of RAM
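For reference, that figure corresponds to 2^64 bytes, i.e. one byte per element; since vector_double stores 8-byte doubles, the actual memory required would be even larger. A tiny sketch of the arithmetic:

#include <cstdint>
#include <iostream>

int main()
{
    // 2^64 bytes expressed in tebibytes: 2^64 / 2^40 = 2^24.
    const std::uint64_t tib = std::uint64_t(1) << (64 - 40);
    std::cout << tib << " TiB\n"; // prints 16777216
}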
Francesco Biscani
@bluescarni
Jun 11 2016 09:11
1/100 will not save you :)
Dario Izzo
@darioizzo
Jun 11 2016 10:09
1000?
😉
Francesco Biscani
@bluescarni
Jun 11 2016 11:38
I have a feeling our docs are getting all over the place
there's a lot of repetition of information, for instance, especially in the error strings, and the information is often slightly incorrect or contradictory
there seems to be no common philosophy about how we mark up things (should types be typeset like this? should params be typeset like this? it's all over the place)
nor any consistency in how we report exceptions; it often looks like we are just reporting some of the stuff that might be thrown, depending on the weather and the day of the week
I am having a really hard time understanding how to write the Python docs, because it seems like no common guidelines are emerging
maybe that's the way a community project works, organized chaos :)
Francesco Biscani
@bluescarni
Jun 11 2016 11:46
for me personally, consistency and attention to detail are very important. Whenever I look at the webpage of a project and I find frequent grammatical errors, punctuation horrors, inconsistency in the reported information, dead links, shitty/inconsistent HTML/markup, etc., I lose much of my will to actually use that project, and I come away with a bad taste in my mouth
for me it's much worse than learning something difficult... if I invest the time to learn the philosophy of something and there is at least consistency, that's much more useful than something that tries to be as user-friendly as possible but, once you look closely, is full of small issues all over the place
but anyway, rant over... I guess I'll turn to writing tests rather than docs :)
Francesco Biscani
@bluescarni
Jun 11 2016 11:53
it just really kills my motivation
Francesco Biscani
@bluescarni
Jun 11 2016 13:42
@darioizzo are you going to make the changes to the hessians/gradient overrides, or should I do them in my branch?
Dario Izzo
@darioizzo
Jun 11 2016 16:20
@bluescarni The changes were already made on the test branch and committed (pushed)
On the docs, can you give a specific case?
Francesco Biscani
@bluescarni
Jun 11 2016 17:28
I see changes only in the sparsity methods and in the set seed, but not in the gradient/hessians overrides. Am I missing something?