These are chat archives for bluescarni/pagmo_reborn

19th May 2016
Dario Izzo
@darioizzo
May 19 2016 06:58
Problem on the design:
Right now the idea is to code algorithms able to solve stochastic problems by putting something like this in their evolve:
```cpp
if (prob.is_stochastic()) {
    // change seed
    prob.set_seed(std::uniform_int_distribution<unsigned long long>()(m_e));
    // re-evaluate the whole population
    for (vector_double::size_type i = 0u; i < dim; ++i) {
        pop.set_xf(i, pop.get_x()[i], prob.fitness(pop.get_x()[i]));
    }
}
```
But, since `prob` can only be obtained as a const reference from `pop.get_problem()`, this generates a compile-time error, as it violates const correctness. Solutions appear to be:
• We mark problem::set_seed as const and let the user implement the problem seed m_seed as mutable.
• We expose a set seed in population, since population has access to the problem as a data member.
• We add a method `get_problem_reference()` to population to extract a reference to the problem.
Solution 1 seems to me the one that would also make our life easy in Python.
Dario Izzo
@darioizzo
May 19 2016 07:40
Solution 3 is the most correct to me in terms of C++
Dario Izzo
@darioizzo
May 19 2016 07:48
```
unknown location(0): fatal error: in "sea_algorithm_test": std::invalid_argument:
function: check_decision_vector
where: /home/dario/Documents/PaGMOreborn/tests/../include/algorithms/../problem.hpp, 1188
what: Length of decision vector is 18446744073704605556, should be 25
```
:)
Francesco Biscani
@bluescarni
May 19 2016 09:02
Solution 1 will not work in Python; the current policy is to always return copies of inner members
Dario Izzo
@darioizzo
May 19 2016 09:02
So how do we do it?
Francesco Biscani
@bluescarni
May 19 2016 09:03
I could look into starting to return references in Python as well, but it scares me a bit
when I tried at the beginning of pygmo it was a nightmare, but maybe today I can do it better :)
Dario Izzo
@darioizzo
May 19 2016 09:04
At the moment I implemented 3 .... as it was quick
Francesco Biscani
@bluescarni
May 19 2016 09:04
but why not just another overload of get_problem()?
why change the name?
Dario Izzo
@darioizzo
May 19 2016 09:05
to make sure one understands what he is doing ... I too was scared
Also, can you overload on the return type?
Francesco Biscani
@bluescarni
May 19 2016 09:05
it's just like the `operator[]` of `std::vector`
Dario Izzo
@darioizzo
May 19 2016 09:06
ok, so if you have two methods, one marked const and one not, the compiler selects the correct one?
Francesco Biscani
@bluescarni
May 19 2016 09:06
do you have 5 minutes for the "spiegone" (the full explanation)?
Dario Izzo
@darioizzo
May 19 2016 09:06
yeahhh!!
go for it
Francesco Biscani
@bluescarni
May 19 2016 09:07
so the executive summary is:
```cpp
problem &get_problem();
const problem &get_problem() const;
```
it helps to think that methods are syntactic sugar for plain functions
as we were mentioning yesterday
with an implicit first argument that is `this`
so these getters are really functions of this form:
```cpp
problem &get_problem(population *this);
const problem &get_problem(const population *this);
```
so you see, the final `const` in the declaration of the methods (the first 2 above these) is really referring to the constness of `this` as it is passed as a function argument
Dario Izzo
@darioizzo
May 19 2016 09:10
reflected also in the error message (passing this as const ... blah)
Francesco Biscani
@bluescarni
May 19 2016 09:10
exactly... so the method constness is the one that allows you to do the overloading
then there is the constness of the return value
in the second getter, the method-const getter, you are forced to return a const reference
because you cannot extract a non-const reference to a class member (via `this`) if `this` is const
Dario Izzo
@darioizzo
May 19 2016 09:12
got it :)
Francesco Biscani
@bluescarni
May 19 2016 09:13
in the first getter you could in principle return a const reference
but that would defeat the purpose clearly
Dario Izzo
@darioizzo
May 19 2016 09:14
hopefully we can use this in python too ....
Francesco Biscani
@bluescarni
May 19 2016 09:14
let's see how it works in python... the problems I had were related to the island class I think
but I don't know really what happens if you do something like this:
```python
>>> p = pop.get_problem()
>>> pop.set_problem(p2)  # Supposing this exists
```
I have no idea how this reference counting is handled internally
Dario Izzo
@darioizzo
May 19 2016 09:18
mmm, but the problem is more:
```python
p = pop.get_problem()
p.set_seed(32)
```
no?
Francesco Biscani
@bluescarni
May 19 2016 09:18
that should work
Dario Izzo
@darioizzo
May 19 2016 09:19
I mean this is what you would write in a user defined algorithm that deals with a stochastic problem
Francesco Biscani
@bluescarni
May 19 2016 09:20
yes.. in theory boost python is able to deal with this:
```python
>>> p = pop.get_problem()  # Gets internal reference.
>>> del pop
>>> p  # This is still valid.
```
Dario Izzo
@darioizzo
May 19 2016 09:20
well I am already more optimistic then ...
Francesco Biscani
@bluescarni
May 19 2016 09:21
I have very bad recollections of the experience
let's hope
Dario Izzo
@darioizzo
May 19 2016 09:21
That's why I am aware this is a potential issue
Francesco Biscani
@bluescarni
May 19 2016 09:22
maybe we can make it work at the lower level.. with the island you have the added problem that you return a reference to an object that is being mutated asynchronously by a thread
`p = isl.get_population()`
good luck with that
Dario Izzo
@darioizzo
May 19 2016 09:23
I thought locks can guard these cases ...
Francesco Biscani
@bluescarni
May 19 2016 09:23
how? once you have a reference there's no locking, you just have a pointer to the class internally
```python
>>> p = isl.get_population()
>>> isl.evolve()
>>> p.get_champion()  # Boom
```
so the island must always copy stuff around
Dario Izzo
@darioizzo
May 19 2016 09:25
but the problem is when you use that pointer to query or do stuff .... I thought that's when you can join
Francesco Biscani
@bluescarni
May 19 2016 09:25
the population has no `join`
only the island
Dario Izzo
@darioizzo
May 19 2016 09:25
right
Francesco Biscani
@bluescarni
May 19 2016 09:25
or even:
```python
>>> p = pop.get_problem()
>>> isl = island(pop)
>>> isl.evolve()
>>> # do stuff with p
```
Dario Izzo
@darioizzo
May 19 2016 09:27
ok, so copy it must be ...
Francesco Biscani
@bluescarni
May 19 2016 09:27
for the island for sure, maybe for the other classes we can get away with returning references
Dario Izzo
@darioizzo
May 19 2016 09:28
btw, when do you want to activate the pull request pipeline? We said when algo was done .... if I finish tests and docs for sea we can start the pull request loop
as far as I am concerned ...
Another option would be to wait for a minimal python version to be ready too
Francesco Biscani
@bluescarni
May 19 2016 09:29
I am for starting immediately with the PRs
Dario Izzo
@darioizzo
May 19 2016 09:30
Ok, then let me finish cosmetics on docs and tests for sea and we freeze master
Francesco Biscani
@bluescarni
May 19 2016 09:30
ok
I am a bit scared of all this reference stuff, I'd like to have a backup plan for when things break horridly
Dario Izzo
@darioizzo
May 19 2016 09:32
Francesco Biscani
@bluescarni
May 19 2016 09:33
maybe we should have forwarding methods in the population
like `set_problem_seed` etc.
Dario Izzo
@darioizzo
May 19 2016 09:33
and it's the only non-const method of problem
so it would just be one
Francesco Biscani
@bluescarni
May 19 2016 09:34
and I hate the cognitive load of having to think about how things can now break with this reference stuff
Dario Izzo
@darioizzo
May 19 2016 09:38
If all is solved by `set_problem_seed` in population let's do it straight away ...
If we instead need to put a ton of forwarding methods here and there it may be worth going with another option
Francesco Biscani
@bluescarni
May 19 2016 09:39
depends on how much stuff we need to add to the generic interface of problem/algorithm
Dario Izzo
@darioizzo
May 19 2016 09:40
you mean in the future?
Francesco Biscani
@bluescarni
May 19 2016 09:40
we'd need also set_verbosity and set_seed for the algo I guess
yes
Dario Izzo
@darioizzo
May 19 2016 09:40
hopefully not much ... but who knows
those would be needed in island and archi?
Francesco Biscani
@bluescarni
May 19 2016 09:41
I think the island does not need any of that, the island and the archi just deal with the asynch dispatch of things
Dario Izzo
@darioizzo
May 19 2016 09:41
so who should forward set_seed of the algo?
Francesco Biscani
@bluescarni
May 19 2016 09:41
ah I see ok... then maybe not :) I forgot the algo is not in the pop
Dario Izzo
@darioizzo
May 19 2016 09:42
So, do you want me to use the overload get_problem or to put a set_seed in population for the time being?
Francesco Biscani
@bluescarni
May 19 2016 09:43
probably the set_seed.. I am afraid that if there's a reference getter we will start using it for other things
Dario Izzo
@darioizzo
May 19 2016 09:43
k
Francesco Biscani
@bluescarni
May 19 2016 09:43
and it will be painful to remove later
Dario Izzo
@darioizzo
May 19 2016 09:44
code review is pending ....
:)
shoot
Francesco Biscani
@bluescarni
May 19 2016 09:44
because right now it's checking if the problem has constraints, multiobjective, etc.
and in that case it errors out
Dario Izzo
@darioizzo
May 19 2016 09:45
yep .. that is something that could be abstracted, as it will go at the beginning of each algo's evolve
Francesco Biscani
@bluescarni
May 19 2016 09:45
I guess the idea was that I should modify the problem if I want to do multiobjective/constraints with penalty functions etc., possibly with a meta problem
is that the idea?
Dario Izzo
@darioizzo
May 19 2016 09:46
If I want to use sea with a multiobjective problem yes, I first need to modify it via a meta so that it becomes single obj
I could use a multiobjective solver though (like NSGA-II) and it would work
Francesco Biscani
@bluescarni
May 19 2016 09:47
right.. the problem I see is that if I run an archipelago with, say, NSGA and SEA for whatever reason, I need to turn the problem into single objective to make it work with SEA, even though NSGA would be able to deal with it
Dario Izzo
@darioizzo
May 19 2016 09:47
That would have to throw I think ....
It's the user that needs to make sure the problem he is solving can be tackled by all algos in a heterogeneous archi
Francesco Biscani
@bluescarni
May 19 2016 09:48
I mean, wouldn't it be better to supply some kind of policy to the SEA ctor to tell it how to deal with multiobjective for instance? like `pagmo::combine_linear_objectives` or something like this
Dario Izzo
@darioizzo
May 19 2016 09:48
In that archipelago sea should work on a decomposed version of the problem .....
But there are tons of ways to transform a multiobj in a single obj or a constrained into box bounded
Does it make sense to offer a default choice?
Francesco Biscani
@bluescarni
May 19 2016 09:49
yes, my comment is that this policy maybe should be part of the algorithm instead of making it a meta-problem
`class pagmo::decompose(const std::vector<double> &weight_vector)` for instance
I need to go brb
Dario Izzo
@darioizzo
May 19 2016 09:51
ok .. I did not get it so you will need to explain again ...
Francesco Biscani
@bluescarni
May 19 2016 10:31

I guess my point is the following: in the current system it's not possible to have an archipelago in which we have a "true" multiobjective algorithm like NSGA, and a single-objective algorithm modified to deal with multiobjective problems in some way (decomposition, etc.). The thing you can do right now is to turn a MO problem into a SO problem via a (yet-to-be-written) metaproblem, but this forces NSGA to work in single-objective mode

I was thinking that if we had a way to tell the algorithm (not the problem) how to transform MO problems in SO ones, then we could accommodate the usage I describe above

so maybe we could implement the logic to turn MO -> SO in a set of classes external to the problem/algorithm classes (like the utils for population sorting right now) and use this set of classes both in the implementation of the MO -> SO metaproblem and as construction arguments for algorithms, so you can instruct a SO algorithm how to deal with MO problems
so I could write both this:
```cpp
p = mo_2_so{multiobjective_problem{}, pagmo::decompose{1., 2., 3.}};
```
but also:
```cpp
algo = sea{pagmo::decompose{1., 2., 3.}};
```
Francesco Biscani
@bluescarni
May 19 2016 10:37
the `decompose` class would be just a functor that combines the multiple objectives into a single objective using the weights `[1.,2.,3.]`
does any of this make sense?
Dario Izzo
@darioizzo
May 19 2016 11:14
I think what you have in mind is what would be possible with a meta - algorithm.
Example: PADE: this is a multiobjective algorithm that has a single objective algorithm as data member. In its evolve it decomposes a multiobjective problem into N single objective ones and uses its internal algorithm to solve them
Is this what you mean?
Francesco Biscani
@bluescarni
May 19 2016 11:17
but you could not do this in a generic fashion from the algorithm I believe... what maybe would work would be if the SO algorithm constructed on the fly a SO problem from the MO one and then used that
Dario Izzo
@darioizzo
May 19 2016 11:18
I do not fully like the idea of having all SO algorithms implemented so that a default behaviour is triggered when a MO problem is passed (or CON)
Francesco Biscani
@bluescarni
May 19 2016 11:18
basically I am troubled that we cannot accommodate the behaviour I have written above... I think the MO metaproblem works fine as long as you are not operating on islands and archipelagos, but it breaks down once you want to do things heterogeneously
not all SO
it would be on a case-by-case basis, you are not forced to deal with any of that in your algorithm if you don't want to
some of the algorithms in the old pagmo had this behaviour hard-coded (turning MO into SO, turning constraints into penalty functions, etc.)
Dario Izzo
@darioizzo
May 19 2016 11:20
Lets make the example of an algorithm I will here call BLUE. BLUE makes some bullshit evolve for single objective optimization and for multiobjective does the same but on decomposed problems.
Francesco Biscani
@bluescarni
May 19 2016 11:20
what I was proposing is to have a consistent way of doing it, but not forcing any algorithm to accommodate it
Dario Izzo
@darioizzo
May 19 2016 11:21
Is this a good example? If so I would write it by putting in evolve the detection of the number of objectives, and then I would there either run the SO case or create (on the fly) decomposed problems and run many SO cases there ...
Francesco Biscani
@bluescarni
May 19 2016 11:22
ok but the operation of decomposition does not depend on a single algorithm, it's something you can always do regardless of the algorithm
Dario Izzo
@darioizzo
May 19 2016 11:23
Yes, hence the metaproblem ....
Francesco Biscani
@bluescarni
May 19 2016 11:23
Dario Izzo
@darioizzo
May 19 2016 11:23
extracting SO problems from MO
In that case it would work as I have NSGAII working with BLUE
and both would be operating in MO mode
as BLUE would detect 2 or more objectives
Francesco Biscani
@bluescarni
May 19 2016 11:24
yes... but that is what I am proposing, just to make it dependent on a set of shared classes that we can reuse from algorithm and problem
because otherwise we will write the same code in multiple different places
Dario Izzo
@darioizzo
May 19 2016 11:25
:) I do not get it .... these shared classes for me are the meta-problems
we write only one `decompose`
Francesco Biscani
@bluescarni
May 19 2016 11:25
no you cannot meta it
Dario Izzo
@darioizzo
May 19 2016 11:25
What can you not meta?
Francesco Biscani
@bluescarni
May 19 2016 11:25
a decomposed problem will expose 1 nobj
Dario Izzo
@darioizzo
May 19 2016 11:26
yes but BLUE builds the meta problems inside its evolve
so the problem BLUE is solving is MO, but inside it uses SO problems
Francesco Biscani
@bluescarni
May 19 2016 11:26
I think what I have written above would work: the SO algorithm would construct on the fly a SO problem from the MO one and then use that
and how do you specify how to decompose the problem to the algorithm?
you construct BLUE with what arguments?
Dario Izzo
@darioizzo
May 19 2016 11:28
Look at MOEA/D .. let me get you the link
Here there is an enum specifying how to generate the weights
But you can pass the weights too directly
I guess it depends on the algorithm ....
Francesco Biscani
@bluescarni
May 19 2016 11:30
but it's not only about the weights, there are other ways of turning MO into SO, aren't there?
that's the reason I was thinking about different classes
Dario Izzo
@darioizzo
May 19 2016 11:32
I am not sure I understand your idea ... but you are probably right. I have in mind the way I implemented this in the current PaGMO and did not encounter limitations in it.
I think you have in mind generic programming where you pass some sort of high level class that does something generic and can be used in meta problems as well as somewhere else ...
I just do not see it .... :(
Francesco Biscani
@bluescarni
May 19 2016 11:33
Dario Izzo
@darioizzo
May 19 2016 11:35
these are (essentially) already all implemented in Pagmo inside the decompose meta problem
Francesco Biscani
@bluescarni
May 19 2016 11:35
I have no idea, but it looked like it would need to pass in some generic function to implement the generic cases
Dario Izzo
@darioizzo
May 19 2016 11:37
I think you are right, there is a generic programming way of dealing with MO->SO transformations, but I guess the devil is in the details.. have a look into decompose meta, there an enum is used to use different transformations
Francesco Biscani
@bluescarni
May 19 2016 11:37
there's lots of logic in that class, it looks like it would not be easy to do the same thing in many algorithms
I was reading the cpp, where the stuff actually is
but I imagine the scheme of creating a decomposed problem on the fly works
I just hope it does not take too much code to use it in the other algorithms
Dario Izzo
@darioizzo
May 19 2016 11:40
My idea on this is that this is deeply part of the algorithm, so it would not make sense to put the mechanics in, say, de. Only where it is known that decomposition works ....
Francesco Biscani
@bluescarni
May 19 2016 11:41
but the logic should be easy enough: if it is a multiobjective problem, create a single objective one with a way specified from the constructor in some way, and optimise that
Dario Izzo
@darioizzo
May 19 2016 11:42
Ok. what would the class pagmo::decompose{1.,2.,3.} do / contain?
Francesco Biscani
@bluescarni
May 19 2016 11:43
maybe we don't even need a class at this point.... it would be the same arguments you would pass down to the `decompose` problem you construct internally, so the question moves on how to design the `decompose` interface
Dario Izzo
@darioizzo
May 19 2016 11:44
Also, remember that when you construct an algorithm you do not know what its use will be.
So, for example, what is the dimension of the weights?
Unknown.
It will only be known inside the evolve
Francesco Biscani
@bluescarni
May 19 2016 11:45
sure... it will throw an error in that case
when you do the evolve
Dario Izzo
@darioizzo
May 19 2016 11:46
So this will work only if you already know that the algorithm will work on (say) a 2-objective problem
then you can pass a fixed length weights.
Francesco Biscani
@bluescarni
May 19 2016 11:46
yep, unless you make the interface more generic to accept a functor that computes in some way the weights given the problem as an argument
Dario Izzo
@darioizzo
May 19 2016 11:47
bleah. I still prefer, then, a meta algorithm (multiobjective) constructed with (say) de that calls de on the decomposed problems.
Francesco Biscani
@bluescarni
May 19 2016 11:48
I don't get how that would change things with respect to knowing in advance the dimension, but I'm getting confused as usual
Dario Izzo
@darioizzo
May 19 2016 11:49
It would not. But it's a cleaner solution, no?
Francesco Biscani
@bluescarni
May 19 2016 11:49
I'd need to see in more detail
Dario Izzo
@darioizzo
May 19 2016 11:49
Let's wait to have de, moead and decompose implemented
Then we can discuss redesign of the whole thing ...
Francesco Biscani
@bluescarni
May 19 2016 11:50
I liked the idea of having an easy way to make SO algorithms accept MO problems, but maybe it's useless
Dario Izzo
@darioizzo
May 19 2016 11:50
I like it too, but it seems it's not that easy. If you have to fix the MO dimension upfront it's kind of useless
And if it forces you to pass additional stuff to construct an algo you will only use in SO it is actually a nuisance
Francesco Biscani
@bluescarni
May 19 2016 11:51
well it's not that useless, if I am building an archipelago to solve a problem I know which problem dimension I will have presumably
you can have default arguments for that
Dario Izzo
@darioizzo
May 19 2016 11:51
But then I prefer the meta-problem solution
If I know the dimension of the fitness (not of the chromosome)
the weights have the same dimension as the fitness
Francesco Biscani
@bluescarni
May 19 2016 11:52
I feel like we are running in circles.. are you talking about a meta problem constructed inside the evolve or outside?
Dario Izzo
@darioizzo
May 19 2016 11:54
Its simple:
• Assume we know the fitness dimension and we want to build an archipelago
• I would use a meta algorithm constructed with any SO solver. Inside its evolve the SO solver is called on a decomposed problem (weights fixed upon construction of the meta)
Francesco Biscani
@bluescarni
May 19 2016 11:54
pseudocode:
```python
a = archipelago()
p = multiobjective_problem()  # I know this guy has 3 objectives
a.push_back(nsga(), p)
a.push_back(sea(decompose_weights=[1., 2., 3.]), p)
```
what would be wrong with that?
Dario Izzo
@darioizzo
May 19 2016 11:57
vs:
```python
a = archipelago()
p = multiobjective_problem()  # I know this guy has 3 objectives
a.push_back(nsga(), p)
a.push_back(meta_algo(sea(), decompose_weights=[1., 2., 3.]), p)
```
Francesco Biscani
@bluescarni
May 19 2016 11:59
but then we have a meta-problem for the decomposition of the problem and a meta-algorithm that kinda does the same thing, not sure... it would indicate that if you want to do decomposition you have to ask yourself if you have to do it in the problem, or in the algorithm or both... I don't know
maybe I just need to get used to the idea :)
Dario Izzo
@darioizzo
May 19 2016 12:00
Thats the thing. To me the two things are very different.
Decomposing a problem or solving a MO problem by using some form of decomposition are different things
Francesco Biscani
@bluescarni
May 19 2016 12:01
fair nuff
Dario Izzo
@darioizzo
May 19 2016 12:01
But you are probably right. The main objection I have to your interface is:
writing `sea(decompose_weights = [1.,2.,3.])` I have the impression of calling the sea algorithm (as defined in the literature).
But in pagmo I would be basically "inventing" a new algorithm
which is cool, but also a bit shaky
Francesco Biscani
@bluescarni
May 19 2016 12:03
no it sounds good, I think the meta algorithm is better now
Dario Izzo
@darioizzo
May 19 2016 12:03
why?
Francesco Biscani
@bluescarni
May 19 2016 12:04
because of what you said about the difference between decomposing and solving by decomposing... it might be a fine difference but that's exactly the kind of stuff I am clueless about
my only comment would probably be that the arguments to `meta_algo` constructor should be the same that go into decompose meta problem I guess, so we have a uniform interface for the two
(well apart from the algo argument of course)
Dario Izzo
@darioizzo
May 19 2016 12:07
Ok, then let me expand .... decomposing a problem can be done essentially as the wikipedia page you posted says (there are a few more ideas around, but essentially that's it). Solving a problem using decomposition can be done a) as we are discussing now, by applying a SO solver to many different decomposed problems b) by ad hoc algorithms such as MOEA/D (PADE too) or god knows what
b) would not be using the interface we are discussing, only the meta-problem decompose would be used somehow inside the evolve method.
This is where my experience ends. So now you know as much as I do :)
Francesco Biscani
@bluescarni
May 19 2016 12:09
I think it's nice to have the meta-algorithm for the easy/general approach, which is probably bullshit from the optimisation point of view but it's still a good option to have
meta algorithm sorry
Dario Izzo
@darioizzo
May 19 2016 12:11
btw such a meta algorithm would be new to PaGMO 2.0. I did not implement it in PaGMO legacy
So our clients have one more reason to upgrade :) !!
Francesco Biscani
@bluescarni
May 19 2016 12:13
our precious paying customers
Dario Izzo
@darioizzo
May 19 2016 12:13
I have finished the main structure. @bluescarni @CoolRunning if you could review the code and main message critically it would be great.
Remember that this code will be cut and pasted by most algorithms, so if we start with bullshit we will end up with tons of bullshit :)
sea and algorithm ......
Francesco Biscani
@bluescarni
May 19 2016 12:15
documentation looks good, maybe I'd put the `log_type` in the public methods so the user can see exactly what it is
what do you mean by cut & paste?
Marcus Märtens
@CoolRunning
May 19 2016 12:17
`The mutation is uniform within the box-bounds. Hence, unbounded problems will be a problem for this simple approach.` sounds a bit cheeky. In fact, what you want to say is, that it will not work and throw an error, right?
You want us to review the docs or the code itself?
Francesco Biscani
@bluescarni
May 19 2016 12:20
`const auto &bounds = prob.get_bounds();` get_bounds still returns a copy, so for consistency this should be changed as well
Francesco Biscani
@bluescarni
May 19 2016 12:25
there's no checking on the problem bounds when you are generating random chromosomes. See all the checks that are done here: https://gitlab.com/EuropeanSpaceAgency/PaGMOreborn/blob/master/include/utils/generic.hpp#L49
Dario Izzo
@darioizzo
May 19 2016 12:26
@CoolRunning Not sure what happens if you call std::uniform_real_distribution<double>(-inf, inf)(m_e);
let me check
Francesco Biscani
@bluescarni
May 19 2016 12:26
I just wrote above... it's undefined behaviour
I think we should probably have another utility function that generates a single number with the checks
it's not only infinites, it's also lb == ub, range checks, etc.
Dario Izzo
@darioizzo
May 19 2016 12:29
But then we again have to copy the random engine?
like we do in decision_vector?
Generating a number within two bounds will be done a lot
Francesco Biscani
@bluescarni
May 19 2016 12:30
no this utility function would take the random engine as argument I guess? it would be essentially the body inside the loop here:
`offspring[j] = std::uniform_real_distribution<double>(lb[j],ub[j])(m_e);` would become `offspring[j] = random_from_range(lb[j],ub[j],m_e);`
or something like that
Dario Izzo
@darioizzo
May 19 2016 12:31
what was the problem in decision vector that forced us to not pass the random engine?
Francesco Biscani
@bluescarni
May 19 2016 12:31
I don't remember, but I don't see it as a problem right now
we don't need to use the decision_vector() full function here
Dario Izzo
@darioizzo
May 19 2016 12:33
I think we were concerned by the python interface.
Francesco Biscani
@bluescarni
May 19 2016 12:33
it's possible yeah
in this case though in python you would just call random(a,b) and not need to worry about undefined behaviour and stuff like that (it's Python's problem)
Dario Izzo
@darioizzo
May 19 2016 12:36
should I then put this in detail and call it also from decision_vector?
@CoolRunning Also the code would need review ....
Francesco Biscani
@bluescarni
May 19 2016 12:37
if it's supposed to be used by the public API it should not be in detail I guess
I mean, it's useful for algorithm writers
Dario Izzo
@darioizzo
May 19 2016 12:37
Ok but it will not go in python ....
Francesco Biscani
@bluescarni
May 19 2016 12:37
no
and btw, in python we can just have an easy reimplementation of decision_vector in pure python.. it's not really a problem
Dario Izzo
@darioizzo
May 19 2016 12:38
k so I only care about c++ and make this efficient
Francesco Biscani
@bluescarni
May 19 2016 12:38
sure
Francesco Biscani
@bluescarni
May 19 2016 12:50
`auto mut = 0u;` this should probably be vector_double::size_type instead of auto
a bit curious why you write `unsigned int i = 1u; i <= m_gen; ++i` instead of the usual zero-based loop... is it because of the screen output?
Dario Izzo
@darioizzo
May 19 2016 12:51
it's a counter, it's not related to vector_double::size_type if not indirectly ...
Francesco Biscani
@bluescarni
May 19 2016 12:51
but it counts something which is of type size_type doesn't it?
Dario Izzo
@darioizzo
May 19 2016 12:52
right ok.
On the counter, it made sense to have gen (generations) to go from 1 to m_gen rather than 0 to m_gen -1
if it bothers us I can change
Francesco Biscani
@bluescarni
May 19 2016 12:53
no it's ok
minor style things, `count % 50 == 1u` -> `count % 50u == 1u` (in 2 places I think) and `while(mut==0)` -> `while(!mut)`
Dario Izzo
@darioizzo
May 19 2016 12:56
logs should be cleared after the throws
changed it
Francesco Biscani
@bluescarni
May 19 2016 12:58
there's also a clang failure due to variable shadowing
Dario Izzo
@darioizzo
May 19 2016 12:58
error log?
Francesco Biscani
@bluescarni
May 19 2016 12:59
Dario Izzo
@darioizzo
May 19 2016 14:24
Done and pushed.
Dario Izzo
@darioizzo
May 19 2016 15:33
logs are not serialized at the moment. I see no harm in that, they will be empty when reconstructed. Makes sense for a mutable member.
Francesco Biscani
@bluescarni
May 19 2016 18:19
```
C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\include\tuple(142): warning C4244: 'initializing': conversion from 'unsigned __int64' to 'unsigned int', possible loss of data [C:\projects\pagmoreborn\build\tests\sea.vcxproj]
C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\include\tuple(289): note: see reference to function template instantiation 'std::_Tuple_val<_This>::_Tuple_val<_Ty>(_Other &&)' being compiled
```
Dario Izzo
@darioizzo
May 19 2016 18:40
so what do we want to do?
it's an error in tuple .... what do we have to do with it?
The other one in zdt is idiotic, but I can fix it....
Francesco Biscani
@bluescarni
May 19 2016 18:43
on zdt you can make it the default instead of case 6
or initialize the retval, it doesn't matter... maybe add an assert that m_id is <= 6 just in case
`log_line_type(i, prob.get_fevals(), pop.get_f()[best_idx][0], improvement, mut)`
`using log_line_type = std::tuple<unsigned int, unsigned int, double, double, unsigned int>;`
`decltype(prob.get_fevals()) != unsigned int`
Francesco Biscani
@bluescarni
May 19 2016 18:48
now I set up the compiler flags in MSVC so that warnings are treated as errors there as well
so we can catch this stuff early
Dario Izzo
@darioizzo
May 19 2016 18:49
how nice!
I tried something .... let's see if I get the ugly message of MSVC
Francesco Biscani
@bluescarni
May 19 2016 18:50
actually it's a reasonable error message
kinda bummed out that clang and GCC did not catch that
Dario Izzo
@darioizzo
May 19 2016 18:51
the mismatch between unsigned types ... yes; the other one, meh
Francesco Biscani
@bluescarni
May 19 2016 18:51
do we have tests for the utilities, like the decision vector random generation?
Dario Izzo
@darioizzo
May 19 2016 18:51
yes
Francesco Biscani
@bluescarni
May 19 2016 18:51
what is it called?
Dario Izzo
@darioizzo
May 19 2016 18:52
let me check
have a look, there are many cases treated; why?
its in generic.cpp
Francesco Biscani
@bluescarni
May 19 2016 18:53
k cool thanks
did some doc fixes on the function that was split off
Dario Izzo
@darioizzo
May 19 2016 18:54
if you have doubts just add a few tests of what you would like to check explicitly .. I tried to cover the basics ....
Francesco Biscani
@bluescarni
May 19 2016 18:54
plus some minor stuff
Dario Izzo
@darioizzo
May 19 2016 18:54
checking
why did you get rid of the NaNs message?
Francesco Biscani
@bluescarni
May 19 2016 18:56
`if (std::isinf(lb) || std::isinf(ub)) {` became `if (!std::isfinite(lb) || !std::isfinite(ub)) {`
Dario Izzo
@darioizzo
May 19 2016 18:57
so it's explicit rather than implicit? ok
I guess isfinite(nan) is false, right?
Francesco Biscani
@bluescarni
May 19 2016 18:58
yes.. earlier that code was embedded in a routine that would explicitly check for NaNs, so the check needed to be made explicit
yes
Dario Izzo
@darioizzo
May 19 2016 18:59
Well, the NaNs were checked by the condition ub-lb < eps
as I found out via the tests
but now it's better, more explicit
Francesco Biscani
@bluescarni
May 19 2016 18:59
yes, I guess it's equivalent in IEEE arithmetic
Dario Izzo
@darioizzo
May 19 2016 19:00
I just hope all these checks in the random number generation will not add visible overhead
random numbers are generated a lot in meta-heuristics
Francesco Biscani
@bluescarni
May 19 2016 19:00
only one way to find out :)
Dario Izzo
@darioizzo
May 19 2016 19:01
knew it !!
the test coverage? is the website up now?
I would love to have the little box in gitlab saying the %
Francesco Biscani
@bluescarni
May 19 2016 19:02
the website I use for that has been completely fucked up for a couple of weeks now... supposedly they rolled out a big upgrade of the site (which was working really well so far) and broke everything
I'll wait a bit and see what happens
Dario Izzo
@darioizzo
May 19 2016 19:02
so it's still down :(
Francesco Biscani
@bluescarni
May 19 2016 19:03
yes
it's really shitty, it used to be a great service
Dario Izzo
@darioizzo
May 19 2016 19:04
build success :) !
Francesco Biscani
@bluescarni
May 19 2016 19:05
:+1:
Dario Izzo
@darioizzo
May 19 2016 19:06
for the test of the algorithms' evolve method I am thinking of just doing a nothrow / throw check
the actual performance cannot be tested .... except by benchmarking with the usual bullshit significance and stats
Francesco Biscani
@bluescarni
May 19 2016 19:07
how about just checking that it solves the 2D rosenbrock problem? Just pick a seed that works :)
Dario Izzo
@darioizzo
May 19 2016 19:08
Is the seed guaranteed to generate the same pseudorandom sequence on all platforms?
(btw, sea does not solve even the 2d rosenbrock ... it gets to 0.01 after 1000 gens :)
Francesco Biscani
@bluescarni
May 19 2016 19:09
so the random engines are guaranteed to produce the same sequence
point is that `unsigned int` might not be the same range on all platforms
Dario Izzo
@darioizzo
May 19 2016 19:09
so it could overflow in some?
Francesco Biscani
@bluescarni
May 19 2016 19:10
it cannot, in that sense, give you the same pseudorandom sequence. I mean, if it is 16-bit on one platform and you request a number, it cannot be 2^20
whereas on another platform you could get 2^20
Dario Izzo
@darioizzo
May 19 2016 19:10
ah .. right, so the actual random numbers drawn will be different
same for the IDs of the individuals
Francesco Biscani
@bluescarni
May 19 2016 19:11
yes... that said, unsigned is 32-bit mostly everywhere, and if we ever run into a case in which it does not work we can still change the test to a better number :)
Dario Izzo
@darioizzo
May 19 2016 19:11
the point is: what would we be testing? if it does not solve rosenbrock 2D, what does that tell us?
Francesco Biscani
@bluescarni
May 19 2016 19:11
but as you prefer
we should have a linear problem to solve
optimise y = x
:)
Dario Izzo
@darioizzo
May 19 2016 19:12
Still tough for sea to get to 10^-13
Francesco Biscani
@bluescarni
May 19 2016 19:12
no it's ok do as you prefer
Dario Izzo
@darioizzo
May 19 2016 19:12
it will get to 10^-2, maybe 10^-3
it's a global algo ... locally it's shit
(actually also globally, it's just there as it's simple and could highlight the other coding challenges)
It may (MAY) be good for stochastic problems, but I doubt it
But it's good to have as it's a reference in the literature :)
Francesco Biscani
@bluescarni
May 19 2016 19:21
yeps
those pagmo problems with the boost linking look a bit insane
I wonder what people are doing with those; I never had a single issue, nor did the symengine folks linking to boost and piranha
Marcus Märtens
@CoolRunning
May 19 2016 19:22
how come this guy is not finding an identifier although I clearly included the corresponding header?
Dario Izzo
@darioizzo
May 19 2016 19:24
which guy?
Marcus Märtens
@CoolRunning
May 19 2016 19:24
my hypervolume algorithm
Dario Izzo
@darioizzo
May 19 2016 19:25
:) any code to share?
Marcus Märtens
@CoolRunning
May 19 2016 19:26
I don't know... it's a bit much
I have one file `hv2d.hpp` which defines class `hv2d`
In another file (`hvwfg.hpp`) I do `#include "hv2d.hpp"`
both files are in the same folder
Now there is one line `return hv2d().compute(points, n_points, m_refpoint);`
And it tells me: identifier `hv2d` not found
:worried:
Dario Izzo
@darioizzo
May 19 2016 19:28
is the `#ifndef` guard correct?
Marcus Märtens
@CoolRunning
May 19 2016 19:29
I have it... something like
``````#ifndef PAGMO_UTIL_HV_ALGORITHMS_HV2D_H
#define PAGMO_UTIL_HV_ALGORITHMS_HV2D_H``````
And the corresponding one in the other file
ah wait
I changed the name
that could be it
no... it wasn't
Dario Izzo
@darioizzo
May 19 2016 19:30
if it's the same it will be skipped ... I often forget to change it (cut and paste)
Marcus Märtens
@CoolRunning
May 19 2016 19:30
In the other file it is different:
Dario Izzo
@darioizzo
May 19 2016 19:31
yes
Marcus Märtens
@CoolRunning
May 19 2016 19:31
I hate this notation
``````#ifndef PAGMO_UTIL_HV_ALGORITHMS_HVWFG_H
#define PAGMO_UTIL_HV_ALGORITHMS_HVWFG_H``````
Dario Izzo
@darioizzo
May 19 2016 19:31
and the endif at the end ... ok then it should be ok
hv2d().blah, just humor me .... try hv2d{}.blah
also can you paste the exact error message?
Marcus Märtens
@CoolRunning
May 19 2016 19:33
no... same error... same for the hypervolume class itself - it seems not to be included?
``````1>------ Build started: Project: hypervolume, Configuration: Debug x64 ------
1>  hypervolume.cpp
1>c:\pagmo\pagmoreborn\include\utils\../utils/hv_algorithms/hv_algorithm.hpp(99): warning C4267: '=': conversion from 'size_t' to 'unsigned int', possible loss of data
1>c:\pagmo\pagmoreborn\include\utils\../utils/hv_algorithms/hv_algorithm.hpp(434): warning C4267: '=': conversion from 'size_t' to 'unsigned int', possible loss of data
1>c:\pagmo\pagmoreborn\include\utils\hv_algorithms\hvwfg.hpp(255): error C2065: 'hv2d': undeclared identifier
1>c:\pagmo\pagmoreborn\include\utils\hv_algorithms\hvwfg.hpp(255): error C2228: left of '.compute' must have class/struct/union
1>  c:\pagmo\pagmoreborn\include\utils\hv_algorithms\hvwfg.hpp(255): note: type is 'unknown-type'
1>c:\pagmo\pagmoreborn\include\utils\hv_algorithms\hvwfg.hpp(266): error C2065: 'hypervolume': undeclared identifier
1>c:\pagmo\pagmoreborn\include\utils\hv_algorithms\hvwfg.hpp(266): error C2146: syntax error: missing ';' before identifier 'hv'
1>c:\pagmo\pagmoreborn\include\utils\hv_algorithms\hvwfg.hpp(266): error C2065: 'hv': undeclared identifier
1>c:\pagmo\pagmoreborn\include\utils\hv_algorithms\hvwfg.hpp(267): error C2065: 'hv': undeclared identifier
1>c:\pagmo\pagmoreborn\include\utils\hv_algorithms\hvwfg.hpp(267): error C2228: left of '.set_copy_points' must have class/struct/union
1>  c:\pagmo\pagmoreborn\include\utils\hv_algorithms\hvwfg.hpp(267): note: type is 'unknown-type'
1>c:\pagmo\pagmoreborn\include\utils\hv_algorithms\hvwfg.hpp(268): error C2065: 'hv': undeclared identifier
1>c:\pagmo\pagmoreborn\include\utils\hv_algorithms\hvwfg.hpp(268): error C2228: left of '.compute' must have class/struct/union
1>  c:\pagmo\pagmoreborn\include\utils\hv_algorithms\hvwfg.hpp(268): note: type is 'unknown-type'
1>c:\pagmo\pagmoreborn\include\utils\hv_algorithms\hvwfg.hpp(323): warning C4267: '=': conversion from 'size_t' to 'unsigned int', possible loss of data
1>c:\pagmo\pagmoreborn\include\utils\hv_algorithms\hvwfg.hpp(324): warning C4267: '=': conversion from 'size_t' to 'unsigned int', possible loss of data
1>c:\pagmo\pagmoreborn\include\utils\hv_algorithms\hv3d.hpp(166): warning C4267: 'initializing': conversion from 'size_t' to 'unsigned int', possible loss of data
1>c:\pagmo\pagmoreborn\include\utils\hv_algorithms\../hypervolume.hpp(214): warning C4267: 'initializing': conversion from 'size_t' to 'unsigned int', possible loss of data
1>c:\pagmo\pagmoreborn\include\utils\hv_algorithms\../hypervolume.hpp(215): warning C4267: 'initializing': conversion from 'size_t' to 'unsigned int', possible loss of data
========== Build: 0 succeeded, 1 failed, 23 up-to-date, 0 skipped ==========``````
Dario Izzo
@darioizzo
May 19 2016 19:35
clearly you are not including it somehow .... filename mess?
Marcus Märtens
@CoolRunning
May 19 2016 19:35
Dunno... I must have messed up something
Dario Izzo
@darioizzo
May 19 2016 19:35
is the class called hv2d ?
can you post the class declaration?
Marcus Märtens
@CoolRunning
May 19 2016 19:36
``````namespace pagmo {

/// hv2d hypervolume algorithm class
/**
* This is the class containing the implementation of the hypervolume algorithm for the 2-dimensional fronts.
* This method achieves the lower bound of n*log(n) time by sorting the initial set of points and then computing the partial areas linearly.
*
* @author Krzysztof Nowak (kn@kiryx.net)
*/
class hv2d : public hv_algorithm
{
public:``````
Dario Izzo
@darioizzo
May 19 2016 19:37
and the file is in the same folder where it's included? no multiple files with the same names in different folders?
just trying the classic bullshit mistakes I normally make :)
Marcus Märtens
@CoolRunning
May 19 2016 19:37
It must be bullshit, for sure
But I seem unable to figure it out :worried:
same bullshit with hypervolume
they are on different levels
but I account for that
file goes like this:
``````#ifndef PAGMO_UTIL_HV_ALGORITHMS_HVWFG_H
#define PAGMO_UTIL_HV_ALGORITHMS_HVWFG_H

#include <iostream>
#include <vector>
#include <cmath>
#include <algorithm>
#include <iterator>

#include "../../io.hpp"
#include "../../types.hpp"
#include "../../exceptions.hpp"

#include "../hypervolume.hpp"
#include "hv_algorithm.hpp"
#include "hv2d.hpp"

namespace pagmo {

/// WFG hypervolume algorithm
/**
* This is the class containing the implementation of the WFG algorithm for the computation of hypervolume indicator.
*
* @see "While, Lyndon, Lucas Bradstreet, and Luigi Barone. "A fast way of calculating exact hypervolumes." Evolutionary Computation, IEEE Transactions on 16.1 (2012): 86-95."
* @see "Lyndon While and Lucas Bradstreet. Applying the WFG Algorithm To Calculate Incremental Hypervolumes. 2012 IEEE Congress on Evolutionary Computation. CEC 2012, pages 489-496. IEEE, June 2012."
*
* @author Krzysztof Nowak (kn@linux.com)
* @author Marcus Märtens (mmarcusx@gmail.com)
*/
class hvwfg : public hv_algorithm
{
public:``````
Dario Izzo
@darioizzo
May 19 2016 19:39
all files end with `#endif` right?
Marcus Märtens
@CoolRunning
May 19 2016 19:41
they do - just checked again
ah well
Dario Izzo
@darioizzo
May 19 2016 19:42
are they all in the same namespace pagmo ?
Marcus Märtens
@CoolRunning
May 19 2016 19:42
I have some circular inclusion
I guess that could mess up things?
Dario Izzo
@darioizzo
May 19 2016 19:42
yes ...
where is the circular inclusion?
Marcus Märtens
@CoolRunning
May 19 2016 19:43
hvwfg -> hv2d
hv2d -> hv3d
hv3d -> hvwfg
how did Kiryx do this?
:confused:
Dario Izzo
@darioizzo
May 19 2016 19:45
so, in hv3d where is hvwfg used? could you just do a forward declaration?
Marcus Märtens
@CoolRunning
May 19 2016 19:46
it is only used once
``````            // Point is dominated
if (p[i][1] >= (*it).first[1]) {
return hvwfg(2).contributions(points, r_point);
}``````
Dario Izzo
@darioizzo
May 19 2016 19:46
and hv3d in hv2d ?
Marcus Märtens
@CoolRunning
May 19 2016 19:47
``````    std::vector<double> contributions(std::vector<vector_double> &points, const vector_double &r_point) const
{
std::vector<vector_double> new_points(points.size(), vector_double(3, 0.0));
vector_double new_r(r_point);
new_r.push_back(1.0);

for (unsigned int i = 0; i < points.size(); ++i) {
new_points[i][0] = points[i][0];
new_points[i][1] = points[i][1];
new_points[i][2] = 0.0;
}
// Set sorting to off since contributions are sorted by third dimension
return hv3d(false).contributions(new_points, new_r);
}``````
And this is where hv2d is used in hvwfg:
``````        // If already sliced to dimension at which we use another algorithm.
if (m_current_slice == m_stop_dimension) {

if (m_stop_dimension == 2) {
// Use a very efficient version of hv2d
return hv2d().compute(points, n_points, m_refpoint);
}
else {``````
Dario Izzo
@darioizzo
May 19 2016 19:49
Ok one last try:
in hvwfg remove the include of hv2d and add at the beginning:
Marcus Märtens
@CoolRunning
May 19 2016 19:50
I think it worked back then because the whole thing was separated into .h and .cpp files
Dario Izzo
@darioizzo
May 19 2016 19:50
``````class hv2d : public hv_algorithm
{
public:
PROTOTYPE DECLARATION OF COMPUTE
}``````
Marcus Märtens
@CoolRunning
May 19 2016 19:51
lets try
Dario Izzo
@darioizzo
May 19 2016 19:51
only the prototype
Marcus Märtens
@CoolRunning
May 19 2016 19:53
Error C2259 'pagmo::hv2d': cannot instantiate abstract class hypervolume c:\pagmo\pagmoreborn\include\utils\hv_algorithms\hvwfg.hpp 262
Dario Izzo
@darioizzo
May 19 2016 19:53
add the prototype for the constructor
Marcus Märtens
@CoolRunning
May 19 2016 19:55
same
did not help
Dario Izzo
@darioizzo
May 19 2016 19:56
yeah, I guessed. It's a 'design' flaw somewhere; I guess @bluescarni may know how to do it correctly .... hopefully without redesigning the whole thing
Marcus Märtens
@CoolRunning
May 19 2016 19:56
mhh.. crap
it worked with the .h and .cpp structure
Francesco Biscani
@bluescarni
May 19 2016 19:57
you could commit it to a branch and then we can take a better look
Marcus Märtens
@CoolRunning
May 19 2016 19:58
My git-skills are lacking as well :tired_face:
I guess I have to create a branch locally and push that thing?
Francesco Biscani
@bluescarni
May 19 2016 19:59
for now you can do that, when we start doing the PR thing it will be slightly different
`git checkout -b my_stuff`
`git push -u origin my_stuff`
that should do it for now
Dario Izzo
@darioizzo
May 19 2016 20:00
in git > 1.7.0
Marcus Märtens
@CoolRunning
May 19 2016 20:00
Now the cascade of problems starts...
include/external/pybind11/tools/clang: needs merge
error: you need to resolve your current index first
Francesco Biscani
@bluescarni
May 19 2016 20:01
ah right
did you do a git pull before this?
Marcus Märtens
@CoolRunning
May 19 2016 20:01
let me do it... but I still have that little hack in my CMakeLists.txt
Francesco Biscani
@bluescarni
May 19 2016 20:02
it's ok we can try to fix it now
Marcus Märtens
@CoolRunning
May 19 2016 20:02
error: Pull is not possible because you have unmerged files.
hint: Fix them up in the work tree, and then use 'git add/rm <file>'
hint: as appropriate to mark resolution and make a commit.
fatal: Exiting because of an unresolved conflict.
git status says
Francesco Biscani
@bluescarni
May 19 2016 20:02
does it tell you which file has the conflict?
Marcus Märtens
@CoolRunning
May 19 2016 20:02
``````Unmerged paths:
(use "git reset HEAD <file>..." to unstage)
(use "git add <file>..." to mark resolution)

Francesco Biscani
@bluescarni
May 19 2016 20:03
ok
gimme a sec
try this
from pagmo root directory
`mv include/external/pybind11 include/external/pybind11_old`
(sorry there was a typo, should be ok now)
Marcus Märtens
@CoolRunning
May 19 2016 20:05
I renamed the folder (I am using windows)
Francesco Biscani
@bluescarni
May 19 2016 20:05
ok
next should be this `git submodule deinit include/external/pybind11`
you committed your changes to your local branch right? that didn't fail?
Marcus Märtens
@CoolRunning
May 19 2016 20:06
C:\PaGMO\PaGMOreborn>git submodule deinit include\external\pybind11
fatal: no submodule mapping found in .gitmodules for path 'include/external/pybind11/tools/clang'
Haven't committed today yet
Did I already mention that git and I hate each other?
Francesco Biscani
@bluescarni
May 19 2016 20:07
ah ok... but so when did the error message appear?
when you did the `git pull` ?
Marcus Märtens
@CoolRunning
May 19 2016 20:07
I thought it was a good idea to have recent versions so I pulled at some point
I think yesterday or the day before
that created the conflict and I was really not in the mood to fix it right away as there was still enough work to do before I wanted to push
(+ I did not understand wtf happened ofc)
Francesco Biscani
@bluescarni
May 19 2016 20:08
mh crap
is your stuff in separate dirs/files right now or do you have other changes influencing the rest of pagmo?
sorry network went down a moment
Marcus Märtens
@CoolRunning
May 19 2016 20:14
my stuff is all in the utils subfolder, if that is what you mean?
Francesco Biscani
@bluescarni
May 19 2016 20:15
yes, you didn't modify other bits of pagmo apart from the CMake stuff?
Marcus Märtens
@CoolRunning
May 19 2016 20:15
nope, didn't
well
there is the test
so there is one hypervolume.cpp in tests
Francesco Biscani
@bluescarni
May 19 2016 20:15
but it's a separate file isn't it?
Marcus Märtens
@CoolRunning
May 19 2016 20:15
aye
I think I haven't touched anything else
Francesco Biscani
@bluescarni
May 19 2016 20:16
ok then rename the current pagmo dir to pagmo_old and re-clone the repository
Marcus Märtens
@CoolRunning
May 19 2016 20:16
okay
done
shall I add my files back?
Francesco Biscani
@bluescarni
May 19 2016 20:18
yes but first let's make your branch
`git checkout -b hypervolume`
Marcus Märtens
@CoolRunning
May 19 2016 20:18
okay
done
Francesco Biscani
@bluescarni
May 19 2016 20:19
`git push -u origin hypervolume`
Marcus Märtens
@CoolRunning
May 19 2016 20:20
aye
Francesco Biscani
@bluescarni
May 19 2016 20:20
ok now copy your stuff back and add it with `git add ...`
Marcus Märtens
@CoolRunning
May 19 2016 20:22
maybe I should try to compile this again...
Francesco Biscani
@bluescarni
May 19 2016 20:24
you can commit and push in the meantime
Marcus Märtens
@CoolRunning
May 19 2016 20:29
well, the whole thing is a mess now... and does not compile
But I will commit
Francesco Biscani
@bluescarni
May 19 2016 20:29
yeah no prob, push it and then we can take a look at the code
Marcus Märtens
@CoolRunning
May 19 2016 20:31
Okay, pushed
Francesco Biscani
@bluescarni
May 19 2016 20:34
k gonna start doing some cleanup to see how far I can get with the compilation here
Marcus Märtens
@CoolRunning
May 19 2016 20:36
good luck :D
Francesco Biscani
@bluescarni
May 19 2016 20:36
:)
Marcus Märtens
@CoolRunning
May 19 2016 20:51
I get tons of linker-errors now... this sucks. Will hold my hands still until you have done your magic ;)
Francesco Biscani
@bluescarni
May 19 2016 20:51
you probably have to re-do the boost setup like you did originally on windows I guess?
Marcus Märtens
@CoolRunning
May 19 2016 20:52
I guess
Dario Izzo
@darioizzo
May 19 2016 20:53
the class hv_algorithm has no data members, right?
Francesco Biscani
@bluescarni
May 19 2016 20:53
``````static double volume_between(const vector_double &a, const vector_double &b, unsigned int dim_bound = 0)
{
if (dim_bound == 0) {
dim_bound = a.size();
}``````
here the compiler is complaining that `dim_bound` and `a.size()` have different sizes: dim_bound is 32-bit and a.size() is 64-bit
Dario Izzo
@darioizzo
May 19 2016 20:53
I changed almost all of them to vector::size_type ... there are a ton
Francesco Biscani
@bluescarni
May 19 2016 20:54
did you fix the virtual destructor as well?
Dario Izzo
@darioizzo
May 19 2016 20:55
yes, and added a copy constructor for hv_algorithm ... but I do not get why a class is without data members ...
Marcus Märtens
@CoolRunning
May 19 2016 20:55
@bluescarni I didn't know how to fix the warning... the code was sort of not intended for committing yet :D
hv_algorithm is a base-class
it was named "base" before
Dario Izzo
@darioizzo
May 19 2016 20:55
I think kiryx randomly put ints and unsigned ints here and there ... like I also am used to doing :)
Francesco Biscani
@bluescarni
May 19 2016 20:56
it's good @CoolRunning, it's actually better that we can look at it now.. it's easier to digest a bit at a time
Marcus Märtens
@CoolRunning
May 19 2016 20:56
It defines a common interface for implementing hv2d etc.
subclasses of hv_algorithm actually do have data members
i.e. hvwfg.hpp has a lot
Dario Izzo
@darioizzo
May 19 2016 21:15
`int r = (*it).second;`
in hv3d, does r need to be int?
what is it? can i safely change it to unsigned?
Marcus Märtens
@CoolRunning
May 19 2016 21:16
I guess auto will do here?
I have no clue :D
Marcus Märtens
@CoolRunning
May 19 2016 21:21
`typedef std::multiset<std::pair<vector_double, int>, hycon3d_tree_cmp > tree_t;`
There is an int in here, so I guess the iterator needs to be int?
Dario Izzo
@darioizzo
May 19 2016 21:22
yep which is a problem when you call the [] operator ....
love kiryx's variable names ... a, b, c, d, r, t, l, d
Marcus Märtens
@CoolRunning
May 19 2016 21:23
basic and succinct.
ahhh
I doubt that this is used for something negative
though I honestly don't know
Dario Izzo
@darioizzo
May 19 2016 21:24
probably not ... but honestly the number of int/unsigned int conversions that need fixing is rather large .... if this does not pass the tests you will have fun debugging :)
Marcus Märtens
@CoolRunning
May 19 2016 21:25
You wouldn't believe how much fun I already have with all of this :D
Dario Izzo
@darioizzo
May 19 2016 21:25
@bluescarni any joy?
Francesco Biscani
@bluescarni
May 19 2016 21:26
if the goal is to get the code up and running quickly, we can selectively disable the signed/unsigned conversion stuff before touching too much
I haven't touched anything apart from the initial dtor thing
Dario Izzo
@darioizzo
May 19 2016 21:26
I think that's a good approach, otherwise we risk inserting bugs
Francesco Biscani
@bluescarni
May 19 2016 21:26
maybe it's better to start off with something that works with minimal modifications
Dario Izzo
@darioizzo
May 19 2016 21:27
yep ... I will stash my modifications. Anyway, the circular dependency problem is still there ..
Francesco Biscani
@bluescarni
May 19 2016 21:29
ok I'll disable the warnings then
do you commit the dtor stuff?
Marcus Märtens
@CoolRunning
May 19 2016 21:30
Well, an easy fix would be to disable the contributions-method in hv2d and hv3d
there is a naive implementation of the contributions method in the base class (hv_algorithm), so it is not needed necessarily
i.e. commenting out contributions in hv2d will make hv2d no longer dependent on hv3d
Dario Izzo
@darioizzo
May 19 2016 21:31
no, should I? I think I have a working setup if we disable warnings
Francesco Biscani
@bluescarni
May 19 2016 21:32
what other changes did you do?
Dario Izzo
@darioizzo
May 19 2016 21:32
copy constructor, and having the derived class constructor call the base class constructor.
thats it
Francesco Biscani
@bluescarni
May 19 2016 21:32
atom here reformatted the file, erasing the spaces, so you should commit the changes - but not the signed/unsigned stuff
you know how to add only partial stuff?
Dario Izzo
@darioizzo
May 19 2016 21:33
I stashed everything and just modified the four things
Francesco Biscani
@bluescarni
May 19 2016 21:33
Dario Izzo
@darioizzo
May 19 2016 21:34
have a look
Francesco Biscani
@bluescarni
May 19 2016 21:36
hv_algo has no members, right?
Marcus Märtens
@CoolRunning
May 19 2016 21:36
it does not
but subclasses of hv_algorithm do have
Dario Izzo
@darioizzo
May 19 2016 21:40
how do I disable the signed/unsigned errors?
Francesco Biscani
@bluescarni
May 19 2016 21:42
well down to 8 errors now
``````#if defined(__clang__) || defined(__GNUC__)
#pragma GCC diagnostic push
#pragma GCC diagnostic ignored "-Wunused-variable"
#pragma GCC diagnostic ignored "-Wshorten-64-to-32"
#endif``````
things like this
Marcus Märtens
@CoolRunning
May 19 2016 21:47
I am confident I have some errors here now that you do not have :P
Francesco Biscani
@bluescarni
May 19 2016 21:47
`hypervolume(const hypervolume &hv) : m_points(hv.m_points), m_copy_points(hv.m_copy_points), m_verify(hv.m_verify) { }` there are no other members in this class?
Marcus Märtens
@CoolRunning
May 19 2016 21:47
correct
Dario Izzo
@darioizzo
May 19 2016 21:48
@bluescarni where? I put it in hypervolume.cpp but nothing happened :(
Francesco Biscani
@bluescarni
May 19 2016 21:48
`BOOST_CHECK_THROW(hv4 = hypervolume(pop2, true), std::invalid_argument);` is this an assignment test or a mistyped comparison test?
@darioizzo you have to put it before the body of code that you want to ignore warnings for, then you have to close it with:
``````#if defined(__clang__) || defined(__GNUC__)
#pragma GCC diagnostic pop
#endif``````
Marcus Märtens
@CoolRunning
May 19 2016 21:49
this assignment should not be possible, as pop2 is a single-objective thingy
Francesco Biscani
@bluescarni
May 19 2016 21:50
you have to put explicitly the warnings you want to have ignore (check the compiler output)
ok just making sure you really need an assignment operator for hypervolume
Marcus Märtens
@CoolRunning
May 19 2016 21:50
so this is intentionally a throw-check
you do not
I think `BOOST_CHECK_THROW(hypervolume(pop2, true), std::invalid_argument);`
Francesco Biscani
@bluescarni
May 19 2016 21:50
well it's needed for the test I mean :)
Marcus Märtens
@CoolRunning
May 19 2016 21:51
should work as well?
I never use hv4
Francesco Biscani
@bluescarni
May 19 2016 21:51
ok yeah it works
so I am down to 3 non-warning related errors
can I commit @darioizzo ?
Dario Izzo
@darioizzo
May 19 2016 21:52
yes
go for it
Francesco Biscani
@bluescarni
May 19 2016 21:53
alright done
so let's take a look at this circular dep then
Dario Izzo
@darioizzo
May 19 2016 21:54
yep ...
Marcus Märtens
@CoolRunning
May 19 2016 21:54
as pointed out: disabling contributions should temporarily fix this
though it would be more efficient to have the specific contributions methods ofc
Francesco Biscani
@bluescarni
May 19 2016 21:59
so the include chain is: hvwfg includes hv2d includes hv3d includes hvwfg
Marcus Märtens
@CoolRunning
May 19 2016 21:59
correct
hv3d has a very efficient implementation of the contributions method... it is so efficient that it is also good for 2 dimensions.
Thus, in hv2d a third dimension of all zeros is added and the rest is handled by hv3d.
hvwfg needs hv2d because WFG is a recursive algorithm and the 2-dimensional case is the base case of the recursion
Francesco Biscani
@bluescarni
May 19 2016 22:01
if they all need each other they should probably go into the same file then
so the way I'd do it
Marcus Märtens
@CoolRunning
May 19 2016 22:02
I am not quite sure why hv3d needs hvwfg though
Francesco Biscani
@bluescarni
May 19 2016 22:02
in order to avoid having a gigantic file
would be to physically put the three files in the detail/ directory, then have a single header file that includes them all (which is what the user is supposed to include)
`return hvwfg(2).contributions(points, r_point);`
here
Marcus Märtens
@CoolRunning
May 19 2016 22:04
Yes, I know the point, but I do not understand why...
Your solution sounds possible, though it would scatter the files around a bit
Francesco Biscani
@bluescarni
May 19 2016 22:05
there's still the need to shuffle around methods though
but it should be doable
Dario Izzo
@darioizzo
May 19 2016 22:06
need to go .... see you tomorrow ....
Francesco Biscani
@bluescarni
May 19 2016 22:06
gnight
I think I need to go as well
will continue tomorrow, I think we are at a good point
Marcus Märtens
@CoolRunning
May 19 2016 22:07
Okay - thx and good n8 guys
Francesco Biscani
@bluescarni
May 19 2016 22:07
night!