`==` effectively does what
Comparing `g_1 == g_2` is a little expensive when they're both large graphs that are very similar except in the "leaf" nodes/inputs.
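To see why that comparison is costly, note that a structural equality check has to walk both graphs node by node, so its cost grows with graph size even when the graphs differ only at the leaves. A toy plain-Python sketch (not Aesara's actual implementation; `Node` and `graphs_equal` are illustrative names):

```python
# Toy expression-graph node: an op name plus input nodes (leaves have none).
class Node:
    def __init__(self, op, inputs=()):
        self.op, self.inputs = op, tuple(inputs)

def graphs_equal(a, b):
    """Recursively compare two expression graphs structurally.

    Every node in both graphs may be visited, so the cost is O(graph size)
    even when the graphs only differ at the leaves.
    """
    if a.op != b.op or len(a.inputs) != len(b.inputs):
        return False
    return all(graphs_equal(x, y) for x, y in zip(a.inputs, b.inputs))

x, y = Node("x"), Node("y")
g1 = Node("add", [Node("mul", [x, y]), x])
g2 = Node("add", [Node("mul", [x, y]), x])
print(graphs_equal(g1, g2))  # True -- but every node was visited
```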
dicts all the time, it would surely have an effect
This seems like a whole other problem in itself.
Right now I'm looking for a way to keep a copy of a nested structure (for instance, a dict containing lists of TensorVariables) so that I can track changes by comparing the old structure to the newly modified one. I ran into this issue when I deep-copied the nested structure.
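Setting the TensorVariable issue aside, the tracking pattern itself can be sketched with plain Python; `changed_keys` is a hypothetical helper, not an Aesara API:

```python
import copy

def changed_keys(old, new):
    """Return the keys whose values differ between two dict snapshots."""
    return [k for k in new if k not in old or old[k] != new[k]]

state = {"coords": [1.0, 2.0], "name": "x"}
snapshot = copy.deepcopy(state)   # deep copy so nested lists are independent
state["coords"].append(3.0)       # mutate a nested value in place

print(changed_keys(snapshot, state))  # ['coords']
```

With a shallow copy the nested list would be shared, the mutation would show up in both structures, and the diff would come back empty — which is why `deepcopy` is needed here in the first place.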
`math` seems a bit cluttered. There is also
```python
import aesara
import aesara.tensor as at

rv, updates = aesara.scan(
    fn=lambda: at.random.normal(0, 1),
    n_steps=5,
)
print(rv.eval())  # [-1.5294442 -1.5294442 -1.5294442 -1.5294442 -1.5294442]
print(rv.eval())  # [-1.5294442 -1.5294442 -1.5294442 -1.5294442 -1.5294442]
```
`numpy` and start working my way through the list. Could do that with
`math`, but this can get unwieldy.
`jax.scipy`) but it would make a lot of sense
`from aesara.tensor.special import softmax`?
`from aesara.tensor.math import softmax`
`from aesara.tensor import softmax` or, more typically,
`import aesara.tensor as at; at.softmax`
`special.py`. It makes comparisons with what's available in `scipy` easier.
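For reference, the `scipy` counterpart lives at `scipy.special.softmax`, and the operation itself is small enough to sketch in plain Python (an illustrative stand-alone version, not Aesara's or SciPy's implementation):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)                              # subtract the max so exp() can't overflow
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

out = softmax([1.0, 2.0, 3.0])
print(out)  # three probabilities summing to 1, increasing with the input
```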
I'm reading the code for the linkers and I have two questions:
1- Is there any reason why the file structure is different for the Numba and JAX linkers?
2- Is there any way to print the generated code without compiling, for debugging? That was very useful with MCX.
Probably because JAX offers much more flexibility than Numba.
For instance, take the case of `Scan`: JAX has built-in scan-like functionality (`jax.lax.scan`), but for Numba we have to create the loops manually.
We could use a code-generation approach for JAX too, but I think we have yet to run into logic that cannot be implemented directly in JAX.
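To make the contrast concrete: a scan threads a carry value through a sequence while collecting per-step outputs. `jax.lax.scan` expresses that directly, whereas a Numba-style backend has to emit the loop itself. A plain-Python sketch of the kind of loop such a backend would generate (`scan_as_loop` is an illustrative name, not an Aesara function):

```python
def scan_as_loop(step, init_carry, xs):
    """step(carry, x) -> (new_carry, y); returns (final_carry, [y0, y1, ...])."""
    carry, ys = init_carry, []
    for x in xs:                  # the loop a codegen backend must write out
        carry, y = step(carry, x)
        ys.append(y)
    return carry, ys

# Cumulative sum expressed as a scan step: the running total is both
# the new carry and the per-step output.
final, outs = scan_as_loop(lambda c, x: (c + x, c + x), 0, [1, 2, 3, 4])
print(final, outs)  # 10 [1, 3, 6, 10]
```

With `jax.lax.scan` the same computation is a single call taking the step function, the initial carry, and the input sequence, which is why the JAX linker can translate `Scan` without generating loop code.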