Where to chat about modeling with Bayesian Networks and integrating with agrum/pyAgrum (https://www.agrum.org)
Hi @leeningzzu,
Learning a dBN is not difficult if you know how to learn BNs :-) It is mainly a database transformation problem. This code is not in pyAgrum since there is no canonical format for stochastic data that we know of.
Suppose you follow the variables A,B,C,... dynamically. Let us call X(t)=(A(t),B(t),C(t),...) the set of variables at time t. So your data for the learning process is a set {X_k(0),...,X_k(T_k), for k in 1...N} (so there are N different trajectories with possibly different lengths T_1,...,T_N).
Then you have to construct the database learn0=(X_1(0),X_2(0),...,X_N(0)) to learn the structure and parameters of the dBN at t=0, and you have to build learnTrans=([X_k(t),X_k(t+1)] for all k, for all t) to learn the structure and parameters of the dBN transition model.
Once the databases have been created, the 2 points are just applications of pyAgrum's learning algorithms, producing BN_0 and BN_transition. This being done, you just have to copy the structures and parameters of these 2 learned BNs into a dBN and it is done. :-)
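For illustration, a minimal sketch of this database transformation, assuming the N trajectories are stored as a list of pandas DataFrames (one per trajectory, with columns A, B, C, ...); the helper name and the "0"/"t" column-suffix convention are only examples, not pyAgrum API:

import pandas as pd
import pyAgrum as gum

def build_learning_databases(trajectories):
    # learn0: one row per trajectory, i.e. the state X_k(0)
    learn0 = pd.DataFrame([traj.iloc[0] for traj in trajectories])

    # learnTrans: one row per observed transition (X_k(t), X_k(t+1)),
    # columns suffixed "0" for time t and "t" for time t+1
    rows = []
    for traj in trajectories:
        for t in range(len(traj) - 1):
            row = {f"{c}0": traj.iloc[t][c] for c in traj.columns}
            row.update({f"{c}t": traj.iloc[t + 1][c] for c in traj.columns})
            rows.append(row)
    return learn0, pd.DataFrame(rows)

# learn0, learnTrans = build_learning_databases(trajectories)
# learn0.to_csv("learn0.csv", index=False)
# learnTrans.to_csv("learnTrans.csv", index=False)
# bn0 = gum.BNLearner("learn0.csv").learnBN()
# bn_transition = gum.BNLearner("learnTrans.csv").learnBN()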
Thanks, Pierre!
Hi @yiruiZhao ,
Sorry for the slightly late answer, this algo (CNLoopyPropagation) is a specific L2U-based algo (binarization is a non-trivial process that deserves to be treated in a non-generic way). We have slightly optimized L2U (as well as our version of MonteCarlo).
The .evi files have a very simple structure and can contain 2 sections:
[EVIDENCE]
where one can specify (soft-like) evidence for each variable, e.g.:
[EVIDENCE]
L 0 1
G 1 0 1
(here L has the value 1, and G cannot have the value 1)
[QUERY]
where one can specify for which variable-value one wants to compute the probability interval, e.g.:
[QUERY]
A 1
H 0 1
(here I am interested in the probability interval for A=0 and for H=1)
2Umin.bif and 2Umax.bif are bif files, but for credal networks (proba min and proba max). They are used with CredalNets (as in http://webia.lip6.fr/~phw/aGrUM/docs/last/notebooks/14-Models_credalNetworks.ipynb.html):
cn=gum.CredalNet("cn/2Umin.bif","cn/2Umax.bif")
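For illustration, a minimal sketch of how the two bif files and an .evi file fit together, roughly following the linked notebook; the file names and the intervalToCredal() preprocessing are assumptions that depend on how the intervals are encoded:

import pyAgrum as gum

cn = gum.CredalNet("cn/2Umin.bif", "cn/2Umax.bif")
cn.intervalToCredal()                # interpret the min/max probabilities as credal sets (assumed needed here)

ie = gum.CNLoopyPropagation(cn)      # the L2U-based algorithm discussed above
ie.insertEvidenceFile("cn/L2U.evi")  # an .evi file in the format described above (file name is an example)
ie.makeInference()
print(ie.marginalMin("A"), ie.marginalMax("A"))  # probability intervals for the values of A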
Hi again @yiruiZhao ,
I just remembered this discussion and that I forgot to mention something. So just to be clear: for Credal Networks, there are 2 inference algorithms implemented in aGrUM (the L2U-based CNLoopyPropagation and our MonteCarlo version, mentioned above).
But on the other hand, we propose a binarization (the one proposed by LG2U) that we do not do automatically because it implies a possibly important modification of the underlying distribution (approximateBinarization).
With this binarization, you can use LoopyBelief on any discrete CN...
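A minimal sketch of that workflow (the file names are placeholders, and depending on the network additional preprocessing may be required):

import pyAgrum as gum

cn = gum.CredalNet("cn/min.bif", "cn/max.bif")  # placeholder file names
cn.intervalToCredal()
cn.approximateBinarization()     # LG2U-style binarization: may noticeably alter the underlying distribution
ie = gum.CNLoopyPropagation(cn)  # loopy belief / L2U is now usable on the binarized CN
ie.makeInference()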
@phwuill_gitlab Hi there. I updated the personal homebrew repo for v0.20.1. While the test in the Formula seems to work just fine (at least the new version is installed, so I guess it compiled and ran properly), when I try it in another (previous) project, I get stuck at missing linker references such as (on OSX):
Undefined symbols for architecture x86_64:
"__ZN3gum30BinaryJoinTreeConverterDefault7convertERKNS_11CliqueGraphERKNS_9HashTableImmSaISt4pairImmEEEERKNS_3SetImSaImEEE", referenced from:
__ZN3gum15LazyPropagationIfE13createNewJT__Ev in main.cpp.o
Did you ever experience such a situation?
Another small issue I do not know if you are aware of: since I have different toolchains and I was trying to figure out the issue, I switched from my homebrewed g++ compiler to the standard OSX one: clang. I noticed that it now ships with OpenMP (previously you just had an error on CMake find_package), but on compilation the #include <omp.h> is not working.
Can you find omp.h in the clang's folder?
Hi @xavier7179 , this error is quite weird since BinaryJoinTreeConverterDefault(...) has not been changed for a while ...
Did you recompile your old project? Do you have several versions of aGrUM installed in different places?
Actually yes, I recompiled everything, and since I installed it using homebrew, only one version remains in the system... I also printed the full verbose version of the command and everything seems right in terms of libraries... that one is only one of the many parts that do not link properly... it looks like the static compilation of the library did not include everything.
If the tests are OK, I think that the library is quite complete. The test coverage is not so bad, particularly for BN inference and especially for LazyPropagation, which is one of the most used algorithms in aGrUM ...
Can you find a minimal piece of code that causes such linking errors so that we can test it ourselves?
I compiled HowToBuildSimpleBN against it ...
The first binary is linked against the .dylib and the second one (39524344 octets) against the .a ... Both are working correctly.
A .o having no symbols is normal: a major part of aGrUM consists of templates (or inlined code), which are compiled with the application and do not create any symbols in the library ...
python act install release aGrUM --static -d INSTALL_PATH . The formula runs the hash table test to verify the installation. In order to check it even further, I downloaded it and checked everything by hand, thus I also ran the act test part.
I found the 'bug'. If I keep the suggested OpenMP include:
find_package(OpenMP)
if (OPENMP_FOUND)
  set (CMAKE_C_FLAGS "${CMAKE_C_FLAGS} ${OpenMP_C_FLAGS}")
  set (CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} ${OpenMP_CXX_FLAGS}")
  set (CMAKE_EXE_LINKER_FLAGS "${CMAKE_EXE_LINKER_FLAGS} ${OpenMP_EXE_LINKER_FLAGS}")
else (OPENMP_FOUND)
  message(FATAL_ERROR "OpenMP not found")
endif(OPENMP_FOUND)
it does not link.
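For what it's worth, a commonly used alternative on OSX/clang is to link CMake's imported OpenMP target instead of appending the raw flags; this is only a sketch (the target name myapp is a placeholder, it assumes CMake >= 3.9, and with Apple clang you typically also need Homebrew's libomp):

cmake_minimum_required(VERSION 3.9)
project(myapp CXX)

add_executable(myapp main.cpp)

find_package(OpenMP)
if (OpenMP_CXX_FOUND)
  # the imported target carries both the compile options and the runtime library,
  # so the linker can find libomp even with Apple clang
  target_link_libraries(myapp PUBLIC OpenMP::OpenMP_CXX)
else ()
  message(FATAL_ERROR "OpenMP not found")
endif ()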