These are chat archives for rlabbe/Kalman-and-Bayesian-Filters-in-Python

30th Jul 2015
Roger Labbe
@rlabbe
Jul 30 2015 15:08

Hi. Those are good additions for the library. Here's the current status:

Missing data is handled by setting z=None. If you are using batch_filter, you might call it with kf.batch_filter(zs=[1., 2., 3., None, 5.]). That is probably not 'canonical' Python behavior, and I will add it to the issues.
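For example, the same convention works step by step (a minimal sketch; the 1-state filter here is hypothetical, just to show the z=None handling):

import numpy as np
from filterpy.kalman import KalmanFilter

# hypothetical 1-state filter, only to demonstrate the z=None convention
kf = KalmanFilter(dim_x=1, dim_z=1)
kf.x = np.array([[0.]])
kf.F = np.array([[1.]])
kf.H = np.array([[1.]])

for z in [1., 2., 3., None, 5.]:
    kf.predict()
    kf.update(z)   # when z is None the measurement update is skipped
                   # and the prior becomes the posterior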

I am working on log-likelihood and metrics like NEES, NIS, etc. for the next release of FilterPy.
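For reference, NEES (normalized estimated error squared) requires ground truth, while NIS (normalized innovation squared) can be computed online from the innovation. A hand-rolled sketch of both using 1-D arrays; the function names are mine, not FilterPy's:

import numpy as np
from numpy.linalg import inv

def nees(x_true, x_est, P):
    # normalized estimated error squared: e' P^-1 e
    e = x_true - x_est
    return np.dot(e, np.dot(inv(P), e))

def nis(z, z_pred, S):
    # normalized innovation squared: y' S^-1 y
    y = z - z_pred
    return np.dot(y, np.dot(inv(S), y))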

I do not currently have an SVD filter. It is on the backlog.

The Kalman filter class uses the standard linear Kalman filter equations; this makes it more pedagogical in nature, though I have used it plenty of times in less demanding situations. The only concession I made to real-world engineering is in the computation of P: the published (I-KH)P equation is numerically unstable.
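The standard stabilized alternative is the Joseph form, which is algebraically identical to (I-KH)P but keeps P symmetric and positive definite under floating-point roundoff. A sketch (the function name is mine):

import numpy as np

def covariance_update_joseph(P, K, H, R):
    # Joseph form: (I - KH) P (I - KH)' + K R K'
    I_KH = np.eye(P.shape[0]) - np.dot(K, H)
    return np.dot(I_KH, P).dot(I_KH.T) + np.dot(K, R).dot(K.T)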

A square root filter is implemented by the SquareRootKalmanFilter class in the filterpy.kalman module. Read the documentation carefully - this is more of a reference implementation, and I have not used it in production. Brown suggests that square root filters are no longer needed on modern hardware unless P is going to vary by 20 orders of magnitude. His reasoning seems strong, but I do not have empirical evidence to back that up.
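A minimal usage sketch, assuming the interface mirrors KalmanFilter's (the constructor arguments and predict/update loop here are my assumption, not verified against the class):

import numpy as np
from filterpy.kalman import SquareRootKalmanFilter

f = SquareRootKalmanFilter(dim_x=2, dim_z=1)
f.x = np.array([[0.],        # position
                [0.]])       # velocity
f.F = np.array([[1., 1.],
                [0., 1.]])   # constant-velocity model
f.H = np.array([[1., 0.]])   # measure position only

for z in [1., 2., 3.]:
    f.predict()
    f.update(np.array([[z]]))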

To round out the descriptions, there are also fading memory and information filters implemented for the linear case. I have an EKF and UKF, but no square root variants of them.

If you want to compute the log-likelihood yourself, you can. This link gives the equation for the computation: http://www.econ.umn.edu/~karib003/help/kalman_example1.htm. Their 'C_t' can be accessed as 'kf.S' in my code after calling update().
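In case that link goes stale: this is just the multivariate Gaussian likelihood of the innovation. With innovation y = z - Hx, innovation covariance S = HPH^T + R (my kf.S), and d the dimension of z, the log-likelihood is

\log L = -\frac{1}{2}\left( d \log 2\pi + \log \det S + y^\top S^{-1} y \right)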

Roger Labbe
@rlabbe
Jul 30 2015 15:18
Here is some code for likelihood. I haven't really tested it:
import numpy as np
from scipy.stats import multivariate_normal
from numpy import log, exp
import scipy.linalg as la

def gaus_pdf(X, M, S):
    # evaluate the multivariate normal pdf N(X; M, S) by hand;
    # X and M are 1-D arrays, S is the covariance matrix
    DX = X - M
    E = 0.5 * np.dot(DX, la.solve(S, DX))   # Mahalanobis term 0.5 * DX' S^-1 DX
    d = M.shape[0]
    E += 0.5 * d * log(2 * np.pi) + 0.5 * log(la.det(S))
    return exp(-E)


def kf_likelihood(x, P, z, H, R):
    # likelihood of measurement z given state estimate x with covariance P
    IM = np.dot(H, x)                      # predicted measurement
    S = np.dot(H, P).dot(H.T) + R          # innovation covariance
    print(gaus_pdf(z, IM, S))              # hand-rolled pdf, as a sanity check
    print(multivariate_normal.pdf(z, mean=IM, cov=S))
    return multivariate_normal.pdf(z, mean=IM, cov=S)
Here is a more compact version, also not really tested:
import numpy as np
from scipy.stats import multivariate_normal

def likelihood(x, P, z, H, R):
    IM = np.dot(H, x)                  # predicted measurement
    S = np.dot(H, P).dot(H.T) + R      # innovation covariance
    return multivariate_normal.pdf(z, mean=IM, cov=S)
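A quick hypothetical check with a 1-D state, just to show the expected shapes:

x = np.array([1.0])       # state estimate
P = np.array([[0.5]])     # state covariance
z = np.array([1.2])       # measurement
H = np.array([[1.0]])     # measurement function
R = np.array([[0.1]])     # measurement noise

print(likelihood(x, P, z, H, R))   # density of z under N(Hx, HPH' + R)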