Hi there! I hope you and yours are well.
New person to Gitter & PySB here. I have some questions after going through the tutorial at https://pysb.readthedocs.io/en/latest/tutorial.html.
1) When I run the visualization command from the command line, I don't get errors, but I also don't get any visual output. Would you know what could be causing this, or things I should try to debug? I can post screenshots (how do I attach something?).
2) Is there a spot for me to learn more about the "Model calibration" and "Modules" sections? They are empty at the moment.
3) I saw multiple typos in the tutorial. Is there some way I can send you the locations of the typos? Sorry to nitpick; spelling is one of those small things that bugs me.
Thank you for your time and I look forward to hearing from you.
Hi @alubbock ,
Sorry for my delay; life is wild.
1) I am on Windows 10 64 bit. For now I will work around the visualization aspect until I explore that deeper in my project. Thank you for the tip on using imgur!
2) I'm checking out PyDREAM as well! Thank you for that insight.
3) Yes, I think I could do the edit and pull request! That sounds like the best option.
When I have more thoughts/questions, I will let you know! Thank you for your help so far.
Dear forum members,
I am a PyDREAM user. I have a quick question. Is there a way to define different forms of priors for different parameters?
Currently I am using uniform priors for all my parameters, via the following line of code:
parameters_to_sample = SampledParam(uniform, loc=lower_limits, scale=scale)
sampled_parameter_names = [parameters_to_sample]
Now, I want to give uniform priors to a few parameters and gaussians to the rest. What is the best way to go about it?
You can use any of the scipy distributions (https://docs.scipy.org/doc/scipy/reference/stats.html) as a prior in pydream.
To use a uniform and a gaussian distribution you can do something like this:
from scipy.stats import norm, uniform
par1 = SampledParam(uniform, loc=lower_limits, scale=scale)
par2 = SampledParam(norm, loc=mean, scale=std)
sampled_parameters = [par1, par2]
Hope this is helpful
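One detail worth knowing: `SampledParam` wraps a scipy distribution, so the `loc`/`scale` keywords follow scipy's conventions, which are easy to get backwards. For `uniform`, the support is `[loc, loc + scale]` (not `[loc, scale]`); for `norm`, `loc` is the mean and `scale` is the standard deviation. A quick check with scipy alone (the bounds here are made up for illustration):

```python
from scipy.stats import norm, uniform

# Hypothetical uniform prior on [-2, 2]: support is [loc, loc + scale]
lower, scale = -2.0, 4.0
u = uniform(loc=lower, scale=scale)

# Gaussian prior with mean 0 and standard deviation 1
g = norm(loc=0.0, scale=1.0)

# uniform pdf is constant (1/scale) inside the support, 0 outside
print(u.pdf(0.0))   # 0.25
print(u.pdf(3.0))   # 0.0 (outside [-2, 2])
print(g.pdf(0.0))   # ~0.3989, peak of the standard normal
```

So if you want a uniform prior on `[a, b]`, pass `loc=a, scale=b - a`.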
Hi @pietromicheli, if your parameters change over time as a function of the concentration of one or more species in your model, you can create an Expression and pass it as a rule's rate. For an example of this, check here:
If you just want to pass a list of parameter values to be used at different time points, I am not aware of a function like that in PySB. However, you could simulate the first time points with the parameters that you want and then use the simulated results as the initial conditions of the next simulation, which has different parameter values. For an example like that, take a look at this function:
@alubbock might have some better ideas :)
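The chaining trick described above can be sketched with plain scipy, independently of PySB (the one-species model and rate names here are toy stand-ins, not anything from an actual PySB model):

```python
import numpy as np
from scipy.integrate import odeint

# Toy model dx/dt = -k * x; "k" stands in for a PySB parameter we want
# to switch partway through the simulation.
def rhs(x, t, k):
    return -k * x

x0 = 10.0
k1, k2 = 1.0, 0.2

# Segment 1: t in [0, 1] with k = k1
t1 = np.linspace(0, 1, 50)
seg1 = odeint(rhs, x0, t1, args=(k1,))

# Segment 2: t in [1, 2] with k = k2, starting from segment 1's endpoint
# (this is the "use the results as the next initial conditions" step)
t2 = np.linspace(1, 2, 50)
seg2 = odeint(rhs, seg1[-1, 0], t2, args=(k2,))

# Agrees with the piecewise analytic solution x0 * e^(-k1) * e^(-k2*(t-1))
print(seg2[-1, 0], x0 * np.exp(-k1) * np.exp(-k2))
```

In PySB the same pattern would use the simulator's output trajectory as the initials of the next `run` call instead of `seg1[-1, 0]`.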
Hi @ortega2247 and @lh64, thank you for the answers! :)
@lh64 you're definitely right, I apologize. I'm trying to model post-synaptic neuron activity:
First, I simulate the gating of post-synaptic ion channels in a PySB model. Then I use the trajectory of the open channels to calculate, for each time point, the quantity of calcium ions that flow in per unit time. This post-simulation calculation will create an array (of length equal to the time span array used for the first PySB simulation) that basically describes the time course of the post-synaptic calcium influx. What I'm trying to do now is pass this array to a second PySB model which contains some calcium-dependent reactions. The goal here is to use the values of my array (one for each time step) to drive a synthesis-like reaction for a Calcium monomer that can be used by all the calcium-dependent reactions. I really hope it's clear enough! :)
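To make the influx step concrete, the post-simulation calculation could look like this (the trajectory, the single-channel flux, and all numbers are made up for illustration; in practice the trajectory would come from the first PySB simulation's output):

```python
import numpy as np

# Fake "open channels" trajectory, standing in for the observable
# extracted from the first simulation
tspan = np.linspace(0, 1, 11)
open_channels = 100 * np.exp(-3 * tspan)   # channels open at each time point

# Assume each open channel passes a fixed number of Ca2+ ions per unit
# time (a made-up single-channel flux for this sketch)
ions_per_channel_per_time = 5.0
ca_influx = open_channels * ions_per_channel_per_time

# One influx value per time point, same length as tspan
print(ca_influx.shape == tspan.shape)   # True
```

This `ca_influx` array is the quantity that would then need to drive the synthesis-like reaction in the second model, one value per time step.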
Thanks a lot @ortega2247, your function seems super cool for creating a kind of discrete event during the simulation, but in this case I want my parameter to change continuously at each time step :)
`model.rules` object by excluding the Rule(s) you don't want, using `model.reset_equations` to clear out the reactions, species, etc. created by BNG, and then regenerating the network. Here's a small example script I put together doing that:
`param_values` arguments to the `run` method of the simulator. I'm attaching an example here. I defined two simple rules: `X() >> None` and `A() + X() >> X()`. In the code, I first run a simulation of the full system. Then, I run two sequential simulations, the first with only the `X() >> None` rule and the second with only the `A() + X() >> X()` rule, where the concentration of `X()` from the first simulation is fed into the simulator for the second simulation using `ScipyOdeSimulator.run`. To be clear, the second "simulation" is actually a series of simulations from output time point to output time point, with the final concentration of `A()` from the previous step and the corresponding value of `X()` passed into the simulator at each step. I have it loop over three different numbers of output points (10, 100, and 1000) to show how the output of the sequential simulation approaches the true result from the full simulation as the number of output points increases. I'm attaching those plots here as well. Hopefully this helps. If you have any further questions, just let us know.
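A stripped-down version of that point-to-point scheme, using plain scipy instead of PySB (toy rates, an analytic `X()` trajectory, and illustrative numbers only; none of this is PySB's actual API):

```python
import numpy as np
from scipy.integrate import odeint

# First "simulation": X() >> None, i.e. dX/dt = -kx * X.
# We use the analytic solution for simplicity.
kx, ka, x0, a0 = 1.0, 0.5, 10.0, 1.0
tspan = np.linspace(0, 2, 100)
x_traj = x0 * np.exp(-kx * tspan)

# Second "simulation": A() + X() >> X(), i.e. dA/dt = -ka * A * X,
# run point-to-point with X held at its value from the first run and
# the final A of each step fed in as the next initial condition.
a = a0
for i in range(len(tspan) - 1):
    seg = odeint(lambda a_, t_, x_: -ka * a_ * x_,
                 a, tspan[i:i + 2], args=(x_traj[i],))
    a = seg[-1, 0]

# True solution of the coupled system:
# A(t) = a0 * exp(-(ka/kx) * x0 * (1 - e^(-kx*t)))
true_a = a0 * np.exp(-(ka / kx) * x0 * (1 - np.exp(-kx * tspan[-1])))
print(abs(a - true_a))   # small; shrinks as len(tspan) grows
```

As in the attached example, refining the grid of output points drives the sequential result toward the true coupled solution.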
@lh64 Thank you for the answer!
Actually, my first approach to this problem was to iteratively run one simulation for each time point, just as you suggested, but I wasn't very happy with the computational cost of such a sequential simulation (especially for a high number of time points). However, looking at your code, I noticed that I had (stupidly) been creating a new simulator object at each iteration (via the ScipyOdeSimulator() constructor). It turned out that this unnecessary step tripled the cost of each iteration, compared to running the simulations (via the .run() method) with the same simulator object at every iteration. The overall cost is obviously still quite high, but for a reasonably small number of time points it is tolerable.
Your answer and your code have been really helpful, thanks a lot! :)
`pip show pydream` in the terminal. The version should be 2.0.0.