Maxim Kochurov
@ferrine
so I can see where this error is
Nikos Koudounas
@aplamhden

Yeah i saw the term Traceback and i got what u mean. Here is the error:

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-51-67616625ebc1> in <module>()
      1 ann_input.set_value(X_test)
      2 ann_output.set_value(Y_test)
----> 3 ppc = pm.sample_ppc(trace, model=basic_model, samples=500, progressbar=False)
      4
      5 # Use probability of > 0.5 to assume prediction of class 1

C:\Users\Nikos\Documents\Lasagne\python-3.4.4.amd64\lib\site-packages\pymc3\sampling.py in sample_ppc(trace, samples, model, vars, size, random_seed, progressbar)
    526         for var in vars:
    527             ppc[var.name].append(var.distribution.random(point=param,
--> 528                                                          size=size))
    529
    530     return {k: np.asarray(v) for k, v in ppc.items()}

C:\Users\Nikos\Documents\Lasagne\python-3.4.4.amd64\lib\site-packages\pymc3\distributions\continuous.py in random(self, point, size, repeat)
    219     def random(self, point=None, size=None, repeat=None):
    220         mu, tau, _ = draw_values([self.mu, self.tau, self.sd],
--> 221                                  point=point)
    222         return generate_samples(stats.norm.rvs, loc=mu, scale=tau**-0.5,
    223                                 dist_shape=self.shape,

C:\Users\Nikos\Documents\Lasagne\python-3.4.4.amd64\lib\site-packages\pymc3\distributions\distribution.py in draw_values(params, point)
    183         if not isinstance(node, (tt.sharedvar.TensorSharedVariable,
    184                                  tt.TensorConstant)):
--> 185             givens[name] = (node, draw_value(node, point=point))
    186     values = [None for _ in params]
    187     for i, param in enumerate(params):

C:\Users\Nikos\Documents\Lasagne\python-3.4.4.amd64\lib\site-packages\pymc3\distributions\distribution.py in draw_value(param, point, givens)
    251         except:
    252             shape = param.shape
--> 253     if len(shape) == 0 and len(value) == 1:
    254         value = value[0]
    255     return value

TypeError: object of type 'TensorVariable' has no len()

Maxim Kochurov
@ferrine
Seems that it’s not an ADVI failure
Something is wrong in draw_values
Do you reproduce the error with a non-ADVI trace?
Nikos Koudounas
@aplamhden
i had the same error while i was trying to run the code from the original ipynb https://github.com/pymc-devs/pymc3/blob/master/docs/source/notebooks/bayesian_neural_network_opvi-advi.ipynb
Yes, i had the same error with NUTS and Metropolis
Maxim Kochurov
@ferrine
Try passing include_transformed=True to approx.sample
I remember some changes there
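For context, a minimal sketch of what that suggestion looks like in code, assuming the names from the traceback above (an approx object from the ADVI fit, basic_model, and the shared test data); this is illustrative only, not the exact code being run:

import pymc3 as pm

# `approx` is the fitted ADVI approximation and `basic_model` the model
# from the notebook above; both are assumed from the earlier messages.
# Sample a trace that keeps transformed variables, then run the PPC on it.
trace = approx.sample(500, include_transformed=True)
ppc = pm.sample_ppc(trace, model=basic_model, samples=500, progressbar=False)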
Nikos Koudounas
@aplamhden
The same error again.
Maxim Kochurov
@ferrine
So opening an issue is the only way now.
We’ll try to solve the problem before release
If it is possible, could you please provide minimal failing example?
Nikos Koudounas
@aplamhden
Sure, i am doing it atm.
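For reference, a minimal failing example along the lines being discussed would presumably follow the shared-variable plus sample_ppc pattern visible in the traceback above. The sketch below is purely hypothetical (made-up data, toy model), not the code that ended up in the issue:

import numpy as np
import theano
import pymc3 as pm

# Toy data standing in for the notebook's train/test split.
X_train, Y_train = np.random.randn(100, 1), np.random.randn(100)
X_test, Y_test = np.random.randn(50, 1), np.random.randn(50)

# Shared variables so the inputs can be swapped after fitting.
ann_input = theano.shared(X_train)
ann_output = theano.shared(Y_train)

with pm.Model() as basic_model:
    w = pm.Normal('w', mu=0, sd=1, shape=(1,))
    out = pm.Normal('out', mu=pm.math.dot(ann_input, w), sd=1,
                    observed=ann_output)
    trace = pm.sample(500)

# Swap in the test data and run the posterior predictive check,
# the step that raised the TypeError above.
ann_input.set_value(X_test)
ann_output.set_value(Y_test)
ppc = pm.sample_ppc(trace, model=basic_model, samples=500)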
Nikos Koudounas
@aplamhden
I opened an issue. Thnx for help @ferrine
Rémi Louf
@rlouf
Hi, I am new to PyMC3 and find the project amazing! I was wondering if there were ways someone like me could start contributing to the project? I would like to become more fluent in Python and learn more about probabilistic programming at the same time :smile:
Junpeng Lao
@junpenglao
hi @rlouf , welcome! I think a great starting point is to run the notebooks in https://github.com/pymc-devs/pymc3/tree/master/docs/source/notebooks and try to fix/report the errors - we always need hands to maintain the notebooks, as we are doing it mostly by hand currently. Also, our API has changed quite a bit in the past few months, so something might be broken and we have not realized it yet.
I think this is also a great way to learn python and bayesian statistics at the same time.
There are also notebooks that need some explanations and background information, for example https://github.com/pymc-devs/pymc3/blob/master/docs/source/notebooks/cox_model.ipynb, which would be a great starting point for contribution as well.
Rémi Louf
@rlouf
Hi @junpenglao ! Thank you for the pointers, it indeed looks like an interesting way to get to know PyMC3 and learn a bit more about bayesian statistics. I'll come back when my homework is done :) I might also add a comprehensive notebook on bayesian A/B testing (with hierarchical models and all)--unless you feel there is no need for it.
Junpeng Lao
@junpenglao
It would be a great addition! ;-)
We just set up our discourse forum; for longer discussions please move to https://discourse.pymc.io/
Thomas Wiecki
@twiecki
:+1:
samuelklee
@samuelklee
Hi all! I've been using the ADVI in PyMC3 to fit a Poisson latent Gaussian model with ARD. In testing on simulated data, I've gotten good results with the old ADVI interface (in that the number of simulated relevant components is correctly recovered), but switching over to the new ADVI interface sometimes gives me inconsistent results. Should I expect these results to be roughly equivalent if I set obj_optimizer=pm.adagrad_window(learning_rate), or are there other parameters I need to set appropriately?
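A rough sketch of the two interfaces being compared; the old-interface call below is an assumption about the pre-OPVI API of that period (argument names may differ), while the pm.fit line mirrors the obj_optimizer setting mentioned above:

import pymc3 as pm

with model:  # `model` is the Poisson latent Gaussian model described above
    # Old ADVI interface (assumed form): returns means/sds and the ELBO trace.
    v_params = pm.variational.advi(n=50000, learning_rate=1e-2)

    # New OPVI-based interface with an explicit optimizer.
    approx = pm.fit(n=50000, method='advi',
                    obj_optimizer=pm.adagrad_window(learning_rate=1e-2))
    trace = approx.sample(500)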
Maxim Kochurov
@ferrine
Hi
Maxim Kochurov
@ferrine
Could you please post the issue to our discourse?
I would recommend checking loss convergence first. Did it converge?
samuelklee
@samuelklee
Yes, convergence looks reasonable. I'll try to put together a discourse post later today. Just curious if I should expect the underlying ADVI implementation to have changed, as I haven't looked at the relevant code in detail yet.
Junpeng Lao
@junpenglao
hey guys, we are thinking about closing the Gitter channel and moving everything to Discourse, as it is easier for discussion and search. What do you think?
Osvaldo Martin
@aloctavodia
@junpenglao I think the lack of response for the last two days is a very clear answer ;-)
Great! I see you have already done a PR
Junpeng Lao
@junpenglao
The gitter channel will remain open for light conversation ;-)
Paddy Horan
@paddyhoran
Does anyone have an example of multidimensional input in Gaussian process regression? So x shape = (num records, p) and y shape = (num records) where p > 1. Thanks
Bill Engels
@bwengals
Set the input dimension arg: cov = pm.gp.cov.Matern52(p, lengthscales)
Paddy Horan
@paddyhoran
Thanks Bill. I did that, and I thought I had everything set up correctly, but when I pass in x as observed I get a shape mismatch error: input[0].shape[1] = num records, but input[1].shape[1] = p. That's why I was looking for an example to study... if I have the model set up correctly I should be able to pass the 2d ndarray directly to observed, right? I was thinking maybe it was a theano issue.
My example works for the 1d case but fails for p > 1. So I'm wondering if I need to wrap the 2d x in a theano object maybe?
Bill Engels
@bwengals
Oh i see, sorry about that. The gp library will be changing a fair bit soon, so these things should be smoothed out in the future
try this:
gp = pm.gp.GP("gp", cov_func=cov, X=X, sigma=sn, observed=y)
setting X as an argument
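Putting Bill's two suggestions together, a sketch of a multidimensional GP setup with the pre-3.2 gp API he is quoting (the shapes, priors, and values here are made up for illustration):

import numpy as np
import pymc3 as pm

n, p = 100, 3                      # num records, input dimension > 1
X = np.random.randn(n, p)          # 2d inputs, shape (n, p)
y = np.random.randn(n)             # targets, shape (n,)

with pm.Model() as gp_model:
    # One lengthscale per input column; the covariance is told its input dim.
    lengthscales = pm.Gamma('lengthscales', alpha=2, beta=1, shape=p)
    cov = pm.gp.cov.Matern52(p, lengthscales)
    sn = pm.HalfCauchy('sn', beta=1)
    # Pass X explicitly as an argument, as suggested above.
    gp = pm.gp.GP('gp', cov_func=cov, X=X, sigma=sn, observed=y)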
also, we're trying to move questions like this over to https://discourse.pymc.io/ . You'll probably get a quicker response there next time :)
Paddy Horan
@paddyhoran
I actually just joined discourse. I'll get a concrete example and post there. Thanks very much Bill, much appreciated.
Bill Engels
@bwengals
thanks! see u over there
Dani Arribas-Bel
@darribas
hello, I have a hierarchical model which was working fine in PyMC 3.0 and, when I upgraded to 3.1 today, on the first ADVI iteration I get FloatingPointError: NaN occurred in optimization. All the parameters I set for the model (priors, etc.) have not changed; I've only changed the API for accessing ADVI to pm.fit. Am I missing something obvious with the upgrade?
Junpeng Lao
@junpenglao
Short answer: there might be some hidden problem in your initial model. If the above posts did not solve your problem, please open a discussion on discourse with your code and (simulated) data
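One generic way to probe for that kind of hidden problem (a common PyMC3 debugging pattern, not something prescribed in this thread) is to evaluate each variable's log-probability at the model's test point and look for nan or -inf before fitting:

# `model` is the hierarchical model built with pm.Model(); print the
# starting-point logp of every basic random variable to spot the term
# that goes to nan or -inf before ADVI even starts.
for RV in model.basic_RVs:
    print(RV.name, RV.logp(model.test_point))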
Dani Arribas-Bel
@darribas
:+1: Excellent! thank you very much @junpenglao !
dlovell
@tsdlovell
is there a Contributor License Agreement for pymc3?
Majid alDosari
@majidaldo
to revive the tensorflow discussion, the main advantage of using tensorflow is proper multi-GPU support
oh is discourse being used now?
Thomas Wiecki
@twiecki
yes, this channel isn't really watched anymore, everyone move to discourse