Pierre-Henri Wuillemin
@phwuill_gitlab
hello
Lionel
@lionel.torti_gitlab
Hello !
Pierre-Henri Wuillemin
@phwuill_gitlab
Welcome to the brand new aGrUMers/Lobby. :-)
Lars
@Lars24679295_twitter
Hey. I recently discovered pyAgrum, and I think it is really cool software! I was hoping you could help me out with one issue: I reinstalled Anaconda on my Windows 10 computer today and also got the latest pyAgrum 0.13.3. Everything seems to work as it did with 0.13.1, except for one thing: gum.loadBN is not able to load .net files (nor .dsl files, but .bif files work) and the kernel dies (try running e.g. the 01-tutorial on your webpage). Do you guys have any ideas as to what could be wrong?
Pierre-Henri Wuillemin
@phwuill_gitlab

Hi Lars, thanks for the kind words :-) I am trying to reproduce your bug.
For now, the code

import pyAgrum as gum

gum.about()

# build a small BN and round-trip it through the BIF, DSL and NET formats
bn=gum.fastBN("A->B<-C->D->A")
bn.saveBIF("one.bif")

bn2=gum.loadBN("one.bif")
bn2.saveDSL("two.dsl")

bn3=gum.loadBN("two.dsl")
bn3.saveNET("three.net")

bn4=gum.loadBN("three.net")

# each reloaded copy should be equal to the original
if bn==bn2:
  print("ok2")
if bn==bn3:
  print("ok3")
if bn==bn4:
  print("ok4")

# a structurally different BN should not compare equal
bn5=gum.fastBN("A->B->C->D")
if bn==bn5:
  print("not ok5")
else:
  print("ok5")

works on Linux and Anaconda macOS64 ... I am updating Anaconda on a Windows computer before trying it there

Pierre-Henri Wuillemin
@phwuill_gitlab
OK, I can reproduce it on Windows. Weird... We'll try to find the bug for the next release (within a week, I hope). Thank you!
Lionel
@lionel.torti_gitlab
I am not having this issue when using the Python shipped with scoop. This could be an issue with the conda-forge package. @Lars24679295_twitter can you try installing pyAgrum using pip and not conda?
Pierre-Henri Wuillemin
@phwuill_gitlab

I confirm that

conda uninstall pyagrum
pip install pyagrum

is a solution for now (or a workaround).

Lars
@Lars24679295_twitter
Thank you so much for your quick reply, it works now on my computer :)
Pierre-Henri Wuillemin
@phwuill_gitlab
:+1:
Julien Schueller
@jschueller
FYI I sent the patch of MR#156 to coco/R
Pierre-Henri Wuillemin
@phwuill_gitlab
:+1:
Julien Schueller
@jschueller
Why the change of license?
Pierre-Henri Wuillemin
@phwuill_gitlab

Hi Julien.

This is a move that we planned a long time ago. LGPL does not change anything for current users. We are doing it now because several contacts consider LGPL an advantage when choosing aGrUM/pyAgrum (more precisely: GPL is seen as a drawback).

Francisco J. Camacho
@pachocamacho1990
Hi everyone! I've been looking for a solid library with pre-defined methods for building Bayesian Networks; thank god I found you guys. I was wondering if you have something for performing parameter learning for continuous (quasi-continuous) variables? The examples I have found so far in the notebooks only concern categorical variables.
Pierre-Henri Wuillemin
@phwuill_gitlab
[image attached]

Hi Francisco, thanks for the kind words! We do not deal with continuous variables for now in aGrUM... mainly because we do not feel very confident in the existing models (like CLG, etc.) for continuous graphical models. We are working on different extensions and collaborations with other libraries to provide a good way to deal with continuous variables in graphical models.

For now, you can of course learn quasi-continuous BN from continuous data (see above)

But be aware that the choice of the discretization may have a large impact on the structure of the final BN
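(For reference, a minimal sketch of that kind of workflow; the column names, bin counts, and file name below are made-up examples rather than what the attached image showed, and the discretization is done with pandas before handing the CSV to gum.BNLearner.)

import numpy as np
import pandas as pd
import pyAgrum as gum

# made-up continuous data; in practice this would be your own dataset
raw = pd.DataFrame({
    "temperature": np.random.normal(20, 5, size=1000),
    "pressure": np.random.normal(1.0, 0.2, size=1000),
})

# discretize each continuous column into a small number of labelled bins
disc = pd.DataFrame({
    col: pd.cut(raw[col], bins=5, labels=[f"{col}_{i}" for i in range(5)])
    for col in raw.columns
})
disc.to_csv("quasi_continuous.csv", index=False)

# learn a quasi-continuous BN from the discretized CSV
learner = gum.BNLearner("quasi_continuous.csv")
bn = learner.learnBN()
print(bn)
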
Francisco J. Camacho
@pachocamacho1990
Thank you for the kind and quick reply, I tried as you said. Instead of discretizing the variable using a homogeneous spacing with linspace() or arange(), I used a customized non-homogeneous spacing because my variables have a range of values on a logarithmic scale! I finally managed to train a BN with such a discretization scheme. I also noticed that the granularity of the discretization scheme has a strong impact on the resulting BN structure; is there any theory or study regarding this connection?
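(For illustration, log-spaced bin edges of that kind could be built as in the sketch below; the variable, its range, and the number of bins are made up.)

import numpy as np
import pandas as pd

# hypothetical variable spanning several orders of magnitude
values = pd.Series(np.random.lognormal(mean=0.0, sigma=2.0, size=1000), name="x")

# non-homogeneous, logarithmically spaced edges instead of linspace()/arange()
edges = np.logspace(np.log10(values.min()), np.log10(values.max()), num=11)

binned = pd.cut(values, bins=edges, labels=[f"x_{i}" for i in range(10)],
                include_lowest=True)
print(binned.value_counts())
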
Pierre-Henri Wuillemin
@phwuill_gitlab

Hi Francisco,

First, before analyzing and using your BN, I suggest you either wait for the new tag "0.15.3" (during this week, hopefully) or (if you use pip to download pyAgrum) install pyAgrum-nightly: I found a nasty bug in the parameter learning algorithm when at least one configuration of the parents of a node is not found in the database. It is fixed in the master branch of our GitLab but not yet deployed, so the results you have for now may be erroneous (in terms of learned parameters). If you prefer, a quick workaround is to add a small prior (Laplace adjustment) with learner.useAprioriSmoothing(1e-5), which guarantees that no parent configuration remains unseen.

Second, for your choice of discretization, this is exactly the reason why we do not do it ourselves :-) the correct discretization depends directly on the database and the user.
Concerning the impact of the granularity, I do not know of any theory on that topic, but it is quite well known that independence tests (such as chi2) and scores are not very robust w.r.t. this granularity.
You may try the learning algorithm based on mutual information instead of scores or chi2 in aGrUM: MIIC (learner.useMIIC()), which should be a bit more robust.

However, it is important to note also that a high granularity (with many values for the discrete variable) will need much more data for the learning phase (structure and parameters).
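(As a sketch of both suggestions together, assuming a CSV of already-discretized data called data.csv:)

import pyAgrum as gum

learner = gum.BNLearner("data.csv")

# Laplace adjustment: a tiny prior so that no parent configuration
# ends up completely unseen in the database
learner.useAprioriSmoothing(1e-5)

# mutual-information-based structure learning (MIIC) instead of
# score- or chi2-based search, usually a bit more robust to the granularity
learner.useMIIC()

bn = learner.learnBN()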

Francisco J. Camacho
@pachocamacho1990
Hi Pierre,
Yeah, I noticed that the inference results were weird in most cases, though. I tried learner.useAprioriSmoothing(1e-5) and got results that are 'apparently better', thanks again!!
Pierre-Henri Wuillemin
@phwuill_gitlab
Nice to hear... The next tag should come out (around) next Monday.
With such a very weak prior, the results are very close to the ML estimation of the probabilities.
(you can make it smaller! :-) )
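(A tiny numeric illustration of that last point, with hypothetical counts: with smoothing weight eps, each estimate becomes (count + eps) / (total + eps * k), which is essentially the ML ratio when eps is tiny.)

# hypothetical counts for a binary variable under one parent configuration
counts = [3, 7]
eps = 1e-5  # smoothing weight (Laplace adjustment)

ml = [c / sum(counts) for c in counts]                                      # [0.3, 0.7]
smoothed = [(c + eps) / (sum(counts) + eps * len(counts)) for c in counts]  # ~[0.3, 0.7]
print(ml, smoothed)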