Rodrigo Gadea
@rodrigogadea_twitter
(although I'm seeing now that Z_eo is pointing to alpha, but nevermind...)
Jaakko Luttinen
@jluttine
ok, nice!
Rodrigo Gadea
@rodrigogadea_twitter
Iteration 1: loglike=-1.098049e+03 (0.003 seconds)
[array([  0.00000000e+000,   5.14238102e-187,   0.00000000e+000,
         0.00000000e+000,   0.00000000e+000,   1.53892421e-221,
         0.00000000e+000,   8.68501324e-151,   0.00000000e+000,
         1.00000000e+000])]
Now I have to see how to wrap it
Is there a way of sending the new nodes to the existing Q?
I mean, without redefining it?
Jaakko Luttinen
@jluttine
you don't even have to use the VB object. it's just a convenience class. you can just use nodes directly, like Z.update(). you can create a new VB object from an old one like: Q_new = VB(new_node1, new_node2, *Q.model)
you can also try adding a node to an existing Q like: Q.model.append(new_node), but that's not recommended because you bypass some checks, so you may accidentally do something wrong. but in principle that works, perhaps
Rodrigo Gadea
@rodrigogadea_twitter
Oh, that's cool :)
Rodrigo Gadea
@rodrigogadea_twitter
thanks a lot for the help and your time
Jaakko Luttinen
@jluttine
no problem!
eykiriku
@eykiriku
Assume I have an observation node that, instead of being Categorical like the activity node here (http://bayespy.org/examples/hmm.html), represents the degree of activation of fuzzy rules: 80% walk and 20% shop are active. What node model should I use instead of Categorical to allow inference with observations that result from fuzzy sets?
Jaakko Luttinen
@jluttine
do you mean that your knowledge about whether it's walk or shop is p(walk)=0.8 and p(shop)=0.2, or what is the meaning of those percentages?
eykiriku
@eykiriku
[image: Capture.PNG]
say this graph is yawning frequency per minute; my input is approx. 2.5 yawns per minute, which gives an 80% activation of the High rule and 20% of the Normal rule
in short, yes you were right
Jaakko Luttinen
@jluttine
if you observe probabilities (or some other variables that are positive and sum to one), maybe one can use a beta or dirichlet node. if you have noisy observations of a categorical variable, maybe you can add a noisy observed categorical variable that has the true (unknown) state as a parent.
eykiriku
@eykiriku
true, thanks
eykiriku
@eykiriku
my problem was more related to bayespy. assume I choose a dirichlet observation node Y. Y has 2 states, Normal and High. As above, any evidence will fall in the interval [0, 5], and the fuzzy rules may lead to an output saying 80% activation of High and 20% of Normal... My doubt is how to define the variable 'activity', the one to be observed for inference in Y.observe(activity), using this fuzzy output
Rodrigo Gadea
@rodrigogadea_twitter
@jluttine , hi! Maybe you know why I can't fix the seed of the random number generator with np.random.seed()? Is that not the seed I must fix? I'm trying to obtain the same likelihood results in different runs, without success... :S
Rodrigo Gadea
@rodrigogadea_twitter
it seems silly, but I can't find a way in django tests to produce the same inference results, because neither np.random.seed nor random.seed seems to work... :S
Rodrigo Gadea
@rodrigogadea_twitter
it seems that setting PYTHONHASHSEED=number makes it work
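For what it's worth, np.random.seed does make NumPy's own draws reproducible, so when results still differ between runs, the remaining nondeterminism usually comes from somewhere else, e.g. hash-randomized iteration order, which is exactly what PYTHONHASHSEED (set before the interpreter starts) pins down. A minimal check:

```python
import numpy as np

np.random.seed(42)
a = np.random.randn(5)

np.random.seed(42)
b = np.random.randn(5)

# Re-seeding NumPy's global RNG reproduces the exact same draws
print(np.array_equal(a, b))  # True
```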
Jaakko Luttinen
@jluttine
rodrigogadea_twitter: ok, interesting
Marvie Demit
@Don_Chili_twitter
hi @jluttine I am starting to use bayespy and I want to make a wrapper function that fits a Gaussian mixture for model selection.
import numpy as np
from bayespy import nodes as nds
from bayespy.inference import VB

def fitted_gaussian(N, n_krnl, D, covariance='full'):
    # Input:
    #   N      = number of data vectors
    #   D      = dimensionality
    #   n_krnl = number of kernels
    # Prior over the mixture weights
    P = nds.Dirichlet(1e-5*np.ones(n_krnl), name='P')
    # N n_krnl-dimensional cluster assignments (for the data)
    I = nds.Categorical(P, plates=(N,), name='I')
    # n_krnl D-dimensional component means
    if covariance == 'full':
        # n_krnl D-dim component covariances
        mu = nds.Gaussian(np.zeros(D), 1e-5*np.identity(D), plates=(n_krnl,), name='mu')
        Lambda = nds.Wishart(D, 1e-5*np.identity(D), plates=(n_krnl,), name='Lambda')
        Y = nds.Mixture(I, nds.Gaussian, mu, Lambda, plates=(N,), name='Y')
    else:
        print('diagonal')
        # inverse variances
        mu = nds.GaussianARD(np.zeros(D), 1e-5*np.identity(D), shape=(D,), plates=(n_krnl,), name='mu')
        Lambda = nds.Gamma(1e-3, 1e-3, plates=(n_krnl, D), name='Lambda')
        Y = nds.Mixture(I, nds.GaussianARD, mu, Lambda, plates=(N,), name='Y')
    I.initialize_from_random()
    return VB(Y, mu, Lambda, I, P)
the problem is that when I apply it with the diagonal covariance I get: ValueError: The plates (2,) of the parents are not broadcastable to the given plates (10,).
How can I fix this?
Jaakko Luttinen
@jluttine
Don_Chili_twitter: is n_krnl=2 and N=10?
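As background on that error (not a fix for this particular model): BayesPy broadcasts plates the way NumPy broadcasts array shapes, where trailing dimensions must be equal or 1, so (2,) against (10,) cannot work:

```python
import numpy as np

# Broadcastable: a dimension of 1, or a missing leading dimension, stretches
print(np.broadcast_shapes((2,), (1,)))     # (2,)
print(np.broadcast_shapes((2,), (10, 2)))  # (10, 2)

try:
    np.broadcast_shapes((2,), (10,))       # 2 vs 10: neither is 1
except ValueError:
    print("(2,) and (10,) are not broadcastable")
```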
Jaakko Luttinen
@jluttine
Don_Chili_twitter: can't really test now, but i guess the issue is that GaussianARD makes scalar gaussian variables by default. in principle, you should be able to fix it by giving ndim=1 to Mixture, but i think that doesn't work atm. alternatively, you can create a diagonal matrix from Lambda with Lambda.as_diagonal_wishart() and use nds.Gaussian instead of nds.GaussianARD in Mixture
that is: Y = nds.Mixture(I, nds.Gaussian, mu, Lambda.as_diagonal_wishart(), plates=(N,), name='Y')
couldn't test though
Marvie Demit
@Don_Chili_twitter
Hi @jluttine ! Thank you for your very fast response: n_krnl should be 10. I already tried your recommendation, but there is something weird about the plates. Although I tried to make the parent plates the same dimension as the given plates, they never match... ValueError: The plates (2,) of the parents are not broadcastable to the given plates (10,).
Jaakko Luttinen
@jluttine
ok, i'll need to check that with actual python. maybe tomorrow
Rodrigo Gadea
@rodrigogadea_twitter
@jluttine are you there?
Jaakko Luttinen
@jluttine
yep
Rodrigo Gadea
@rodrigogadea_twitter
:)
Hi Jaakko, can I ask you for a quick review of a text?
Jaakko Luttinen
@jluttine
you can always ask :)
Rodrigo Gadea
@rodrigogadea_twitter
I'm finishing the documentation for the first release of django-ai and I wanted your opinion, in case I am missing something
Jaakko Luttinen
@jluttine
ok
Rodrigo Gadea
@rodrigogadea_twitter
Jaakko Luttinen
@jluttine
student t distribution is possible with bayespy. one just needs to construct it as a combination of gamma+gaussian. so one could implement mixture of student t with bayespy
Rodrigo Gadea
@rodrigogadea_twitter
ah, cool, how would that be?
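The gamma+gaussian construction is the standard scale-mixture representation of the Student-t: if the precision tau ~ Gamma(nu/2, nu/2) and x | tau ~ N(mu, 1/tau), then marginally x follows a Student-t with nu degrees of freedom. A quick NumPy sanity check of that claim (not BayesPy code; in BayesPy one would plausibly use a Gamma node as the precision parent of a Gaussian/GaussianARD node):

```python
import numpy as np

rng = np.random.default_rng(0)
nu = 5.0
n = 200_000

# Scale mixture: precision tau ~ Gamma(shape=nu/2, rate=nu/2)
# (NumPy parameterizes by scale, so scale = 2/nu)
tau = rng.gamma(shape=nu/2, scale=2/nu, size=n)
x = rng.normal(0.0, 1.0/np.sqrt(tau))

# The marginal of x is Student-t(nu), whose variance is nu/(nu-2)
print(x.var())  # close to 5/3 ≈ 1.667
```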
section 2
Jaakko Luttinen
@jluttine
yep
vb learning can converge to a bad local minimum. better initialization or a better learning algorithm may help, for instance deterministic annealing. also, sometimes changing the model might help: for instance, if one factorizes q(mu)q(Lambda), then re-formulating the model so that one gets a joint q(mu,Lambda) might improve the posterior accuracy and the learning