Kalman Filter textbook using IPython Notebook. This book takes a minimally mathematical approach, focusing on building intuition and experience, not formal proofs. Includes Kalman filters, extended Kalman filters, unscented Kalman filters, and more. Includes exercises with solutions.
I changed %matplotlib inline to %matplotlib notebook and the figure became interactive! I could rotate the PDF to see more angles.
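For anyone who wants to try it, a minimal sketch (the Gaussian surface here is made up; any 3D figure becomes rotatable the same way):

%matplotlib notebook
import numpy as np
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D   # registers the 3d projection on older matplotlib
from scipy.stats import multivariate_normal

xs, ys = np.meshgrid(np.linspace(-3, 3, 100), np.linspace(-3, 3, 100))
zs = multivariate_normal([0., 0.], [[1., .5], [.5, 1.]]).pdf(np.dstack((xs, ys)))

fig = plt.figure()
ax = fig.add_subplot(111, projection='3d')   # drag to rotate with the notebook backend
ax.plot_surface(xs, ys, zs, cmap='viridis')
plt.show()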
In plot_estimate_chart_3(), the residual line is black and hard to see. I updated my local copy to make it green, though something brighter like magenta could be better. Change line 107 of code/gh_internal.py to:
ax.annotate('', xy=[1,159], xytext=[1,164.2], arrowprops=dict(arrowstyle='-', ec='g', lw=1, shrinkA=8, shrinkB=8))
Hi. Those are good additions for the library. Here's the current status:
Missing data is handled by setting z=None. If using batch_filter, you might call it with kf.batch_filter(zs=[1., 2., 3., None, 5.]). That is probably not 'canonical' python behavior, and I will add it to the issues.
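A minimal sketch of what that looks like step by step (the filter matrices here are made up for illustration):

import numpy as np
from filterpy.kalman import KalmanFilter

kf = KalmanFilter(dim_x=2, dim_z=1)
kf.x = np.array([[0.], [0.]])       # state: position and velocity
kf.F = np.array([[1., 1.],
                 [0., 1.]])         # state transition
kf.H = np.array([[1., 0.]])         # measurement function
kf.P *= 10.                         # initial uncertainty
kf.R = np.array([[1.]])             # measurement noise

zs = [1., 2., 3., None, 5.]         # None marks the missing measurement
for z in zs:
    kf.predict()
    kf.update(z)                    # update(None) skips the correction step
    print(kf.x.T)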
I am working on log-likelihood, and metrics like NEES, NIS, etc for the next release of FilterPy.
I do not currently have an SVD filter. It is on the backlog.
The Kalman filter class uses the standard linear Kalman filter equations; this makes it more pedagogical in nature, though I have used it plenty of times in less demanding situations. The only concession I made to real world engineering is in the computation of P - the published (I-KH)P equation is unstable.
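For the curious, the usual numerically stable alternative is the Joseph form; this toy sketch (made-up matrices, not FilterPy's source) shows the two side by side:

import numpy as np

P = np.array([[1.0, 0.1],
              [0.1, 1.0]])      # prior covariance (made up)
H = np.array([[1.0, 0.0]])      # measurement function
R = np.array([[0.5]])           # measurement noise

S = H @ P @ H.T + R             # innovation covariance
K = P @ H.T @ np.linalg.inv(S)  # Kalman gain
I = np.eye(2)

P_simple = (I - K @ H) @ P                                 # published form; loses symmetry under round-off
P_joseph = (I - K @ H) @ P @ (I - K @ H).T + K @ R @ K.T   # Joseph form; stays symmetric and positive definite

print(P_simple)
print(P_joseph)   # identical in exact arithmetic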
A square root filter is implemented by the class SquareRootKalmanFilter, in the filterpy.kalman module. Read the documentation carefully - this is more of a reference implementation and I have not used it in production. Brown suggests that square root filters are no longer needed with modern hardware unless P is going to vary by 20 orders of magnitude. His reasoning seems strong, but I do not have empirical evidence to back that up.
To round out the descriptions, there are also fading memory and information filters implemented for the linear case. I have an EKF and UKF, but not with square root variants.
If you want to compute the log-likelihood yourself you can. This link gives the equation for the computation: http://www.econ.umn.edu/~karib003/help/kalman_example1.htm. Their 'C_t' can be accessed with 'kf.S' in my code after calling update().
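Spelled out, that quantity is just the multivariate Gaussian density of the residual (a standard result, consistent with the code below): with $y = z - Hx$ and $S = HPH^\mathsf{T} + R$,

\[
p(z \mid x) = \frac{1}{\sqrt{(2\pi)^{d}\,\lvert S\rvert}}\;\exp\!\left(-\tfrac{1}{2}\, y^\mathsf{T} S^{-1} y\right),
\]

where $d$ is the dimension of the measurement; take the log of this for the log-likelihood.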
import numpy as np
from scipy.stats import multivariate_normal
from numpy import log, exp
import scipy.linalg as la

def gaus_pdf(X, M, S):
    # evaluate the multivariate normal N(M, S) at X; X and M are column vectors
    DX = X - M
    E = 0.5 * np.dot(DX.T, la.solve(S, DX))    # 0.5 * DX' * inv(S) * DX
    d = M.shape[0]
    E = E + 0.5 * d * log(2 * np.pi) + 0.5 * log(la.det(S))
    return exp(-E)

def kf_likelihood(x, P, z, H, R):
    # likelihood of the measurement z given the prior estimate (x, P)
    IM = np.dot(H, x)                  # predicted measurement H*x
    S = np.dot(H, P).dot(H.T) + R      # innovation covariance
    pdf = multivariate_normal.pdf(np.ravel(z), mean=np.ravel(IM), cov=S)
    print(gaus_pdf(z, IM, S))          # both computations give the same value
    print(pdf)
    return pdf
import numpy as np
from scipy.stats import multivariate_normal

def likelihood(x, P, z, H, R):
    IM = np.dot(H, x)                  # predicted measurement
    S = np.dot(H, P).dot(H.T) + R      # innovation covariance
    return multivariate_normal.pdf(z, mean=np.ravel(IM), cov=S)
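A hypothetical one-dimensional call, with made-up numbers, just to show the shapes involved:

x = np.array([10.0])      # state estimate
P = np.array([[4.0]])     # state covariance
H = np.array([[1.0]])     # measurement function
R = np.array([[1.0]])     # measurement noise
z = 11.5                  # observed measurement

print(likelihood(x, P, z, H, R))   # density of z under N(Hx, H P H' + R)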
When I install the package filterpy, I get the following error:
C:\WINDOWS\system32>conda install --channel https://conda.anaconda.org/phios filterpy
Fetching package metadata: ......
Solving package specifications: ..........
Error: Unsatisfiable package specifications.
Generating hint:
[ COMPLETE ]|##################################################| 100%
Hint: the following packages conflict with each other:
Use 'conda info filterpy' etc. to see the dependencies for each package.
Note that the following features are enabled:
Use pip install filterpy instead; that should work. Recently I tried to get filterpy working with conda; conda install filterpy might work for Windows, but I have reports it doesn't work for Mac and Linux yet. I need to put more effort into this.
@rlabbe I just started working through the Jupyter notebooks. Your intuitive approach to the subject matter is very refreshing. I do have a question/observation about the material in Chapter 3... I find the discussion of the product vs sum of "Gaussians" a bit confusing. It seems that you are discussing the sum of Gaussian random "variables" and the product of Gaussian probability "distributions". The sum of two independent Gaussian random variables is also Gaussian-distributed. The product of two Gaussian random variables is not, in general, Gaussian-distributed.
Having now made it through Chapter 4... I think the source of the confusion is that, in both cases, we are really talking about operations on Gaussian "distributions" rather than random "variables". The mathematical operation involved in the "prediction" step is really a convolution, rather than a sum, of Gaussian "distributions", which can be shown to be a Gaussian "distribution" with mean and variance as described in Chapter 3. At least, that's what I think after reading Chapter 4... Looking forward to further enlightenment in the upcoming chapters... :-)
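A quick numerical check of that reading (arbitrary numbers): the "predict" step convolves the two densities, the "update" step multiplies them.

import numpy as np

rng = np.random.default_rng(1)
mu1, var1 = 10.0, 4.0
mu2, var2 = 15.0, 9.0

# Sum of independent Gaussian random variables (equivalently, the convolution
# of their densities) is Gaussian with mean mu1+mu2 and variance var1+var2.
s = rng.normal(mu1, np.sqrt(var1), 200_000) + rng.normal(mu2, np.sqrt(var2), 200_000)
print(s.mean(), s.var())          # approximately 25 and 13

# The product of the two density functions, renormalized, is again Gaussian
# with these parameters -- this is the "update" step of the filter.
mean = (var2 * mu1 + var1 * mu2) / (var1 + var2)
var = (var1 * var2) / (var1 + var2)
print(mean, var)                  # about 11.54 and 2.77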