`from aesara.tensor.special import softmax`?
`from aesara.tensor.math import softmax` or `from aesara.tensor import softmax`, or more typically `import aesara.tensor as at; at.softmax`.
`special.py`: it makes comparisons with what's available in `scipy` easier.
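For example, a quick check like this (a sketch, not from this thread; newer Aesara versions also accept an `axis` keyword mirroring SciPy, which is not needed here) shows that the row-wise softmax matches `scipy.special.softmax`:

```python
import numpy as np
import aesara
import aesara.tensor as at
from scipy.special import softmax as sp_softmax

x = at.dmatrix("x")
f = aesara.function([x], at.softmax(x))  # row-wise softmax for 2-D inputs

data = np.random.default_rng(0).normal(size=(3, 4))
np.testing.assert_allclose(f(data), sp_softmax(data, axis=-1), rtol=1e-6)
```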
I'm reading the code for the linkers and I have 2 questions:
1- Is there any reason why the file structure is different for the Numba and JAX linkers?
2- Is there any way to print the generated code without compiling it, for debugging? That was very useful with MCX.
Probably because JAX offers much more flexibility than Numba.
For instance, take the case of `Scan`: JAX has built-in scan-like functionality, but for Numba we have to create the loops manually (see the sketch below).
We could use a code-generation approach in JAX too, but I think we have yet to run into logic that cannot be implemented in JAX.
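For a concrete picture, here is a rough sketch (not Aesara's actual linker code; it assumes `jax` and `numba` are installed) of the same cumulative-sum "scan" in both styles: JAX's built-in `jax.lax.scan` versus an explicitly written loop for Numba.

```python
import numpy as np
import jax
import jax.numpy as jnp
from numba import njit


def cumsum_jax(xs):
    # JAX: the loop is a single call to the scan primitive.
    def step(carry, x):
        carry = carry + x
        return carry, carry  # (new carry, per-step output)

    _, ys = jax.lax.scan(step, jnp.zeros(()), xs)
    return ys


@njit
def cumsum_numba(xs):
    # Numba: no scan primitive, so the loop body is spelled out
    # (this is the part a code-generation approach has to produce).
    ys = np.empty_like(xs)
    acc = 0.0
    for i in range(xs.shape[0]):
        acc += xs[i]
        ys[i] = acc
    return ys


xs = np.arange(5.0)
print(cumsum_jax(jnp.asarray(xs)))  # [ 0.  1.  3.  6. 10.]
print(cumsum_numba(xs))             # [ 0.  1.  3.  6. 10.]
```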
Regarding `Scan` `Op`s, and `Op`s in general: `Op.perform` implementations will ultimately be able to serve as the only necessary `Op` implementation code. The caveat is that, in `Op.perform`, `list`s serve as "pointers" to memory that Theano manages manually.
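To make that last point concrete, here is a minimal custom `Op` sketch (using the standard `Op.perform` interface, not code from this thread): `output_storage` is a list of single-element `list`s, and results are handed back by writing into slot `[0]` of each.

```python
import numpy as np
import aesara
import aesara.tensor as at
from aesara.graph.basic import Apply
from aesara.graph.op import Op


class DoubleOp(Op):
    """Multiply the input by two in pure Python/NumPy."""

    def make_node(self, x):
        x = at.as_tensor_variable(x)
        return Apply(self, [x], [x.type()])

    def perform(self, node, inputs, output_storage):
        (x,) = inputs
        # `output_storage[0]` is a one-element list managed by the runtime;
        # assigning to its first slot is how the result is returned.
        output_storage[0][0] = np.asarray(2 * x)


x = at.dvector("x")
f = aesara.function([x], DoubleOp()(x))
print(f(np.array([1.0, 2.0, 3.0])))  # [2. 4. 6.]
```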
@brandonwillard: I tried to follow your blog post on DLMs, but I didn't succeed in getting auto_updates to work with this minimal graph:
```python
import numpy as np
import aesara
import aesara.tensor as at

rng = aesara.shared(np.random.default_rng(), borrow=True)
rng.tag.is_rng = True
rng.default_update = rng


def step(v_tm1, rng):
    v_t = at.random.normal(v_tm1, 1.0, name='v', rng=rng)
    return v_t


v, updates = aesara.scan(
    fn=step,
    outputs_info=[np.array(0.0)],
    non_sequences=[rng],
    n_steps=5,
    strict=True,
)

v_draws = aesara.function([], v, updates=updates)()
# This fails: all of the values have the same offset.
assert len(np.unique(np.diff(v_draws))) > 1
```
Anything obvious I am missing?