Kalman Filter textbook using IPython Notebook. This book takes a minimally mathematical approach, focusing on building intuition and experience, not formal proofs. Includes Kalman filters, extended Kalman filters, unscented Kalman filters, and more. Includes exercises with solutions.
import numpy as np
from numpy import log, exp
from scipy.stats import multivariate_normal
import scipy.linalg as la

def gaus_pdf(X, M, S):
    # evaluate the multivariate Gaussian pdf at X, with mean M and covariance S
    DX = X - M
    E = 0.5 * np.dot(DX.T, la.solve(S, DX))   # 0.5 * DX' S^-1 DX
    d = M.shape[0]
    E = E + 0.5 * d * log(2 * np.pi) + 0.5 * log(la.det(S))
    P = exp(-E)
    return P

def kf_likelihood(x, P, z, H, R):
    # likelihood of measurement z given the predicted state x and covariance P
    IM = np.dot(H, x)                          # predicted measurement
    S = np.dot(H, P).dot(H.T) + R              # innovation covariance
    print(gaus_pdf(z, IM, S))
    print(multivariate_normal.pdf(z, mean=IM, cov=S))
    return multivariate_normal.pdf(z, mean=IM, cov=S)
from scipy.stats import multivariate_normal

def likelihood(x, P, z, H, R):
    IM = np.dot(H, x)
    S = np.dot(H, P).dot(H.T) + R
    return multivariate_normal.pdf(z, mean=IM, cov=S)
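A quick sanity check with made-up numbers (these values are not from the chat, just an illustration) showing that the hand-rolled gaus_pdf and the scipy-based likelihood agree:

# example values, chosen only to exercise the two functions
x = np.array([1.0, 0.5])                 # state: position and velocity
P = np.diag([4.0, 1.0])                  # state covariance
H = np.array([[1.0, 0.0]])               # measure position only
R = np.array([[2.0]])                    # measurement noise
z = np.array([1.3])                      # measurement

print(likelihood(x, P, z, H, R))                               # scipy version
print(gaus_pdf(z, np.dot(H, x), np.dot(H, P).dot(H.T) + R))    # hand-rolled version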
When I install the package filterpy, I get the following error:
C:\WINDOWS\system32>conda install --channel https://conda.anaconda.org/phios filterpy
Fetching package metadata: ......
Solving package specifications: ..........
Error: Unsatisfiable package specifications.
Generating hint:
[ COMPLETE ]|##################################################| 100%
Hint: the following packages conflict with each other:
Use 'conda info filterpy' etc. to see the dependencies for each package.
Note that the following features are enabled:
Use pip install filterpy
instead; that should work. Recently I tried to get filterpy working with conda; conda install filterpy
might work for Windows, but I have reports that it doesn't work for Mac and Linux yet. I need to put more effort into this.
@rlabbe I just started working through the Jupyter notebooks. Your intuitive approach to the subject matter is very refreshing. I do have a question/observation about the material in Chapter 3... I find the discussion of the product vs sum of "Gaussians" a bit confusing. It seems that you are discussing the sum of Gaussian random "variables" and the product of Gaussian probability "distributions". The sum of two independent Gaussian random variables is also Gaussian-distributed. The product of two Gaussian random variables is not, in general, Gaussian-distributed.
Having now made it through Chapter 4... I think the source of the confusion is that, in both cases, we are really talking about operations on Gaussian "distributions" rather than random "variables". The mathematical operation involved in the "prediction" step is really a convolution, rather than a sum, of Gaussian "distributions", which can be shown to yield a Gaussian "distribution" with mean and variance as described in Chapter 3. At least, that's what I think after reading Chapter 4... Looking forward to further enlightenment in the upcoming chapters... :-)
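A small numeric sketch of that distinction (the means and variances here are just example values, not taken from the book): convolving two Gaussian densities, as in the predict step, gives a Gaussian whose mean and variance are the sums of the originals, while multiplying and renormalizing them, as in the update step, gives a Gaussian with the familiar combined mean and a smaller variance.

import numpy as np
from scipy.stats import norm

m1, s1 = 1.0, 2.0          # mean, standard deviation of the first Gaussian (example values)
m2, s2 = 4.0, 1.5          # mean, standard deviation of the second Gaussian (example values)

x = np.linspace(-20., 20., 4001)
dx = x[1] - x[0]
g1 = norm.pdf(x, m1, s1)
g2 = norm.pdf(x, m2, s2)

# predict: the density of the *sum* of the random variables is the convolution of the pdfs
conv = np.convolve(g1, g2, mode='same') * dx
mean_c = np.sum(x * conv) * dx
var_c = np.sum((x - mean_c)**2 * conv) * dx
print(mean_c, m1 + m2)             # ~5.0: the means add
print(var_c, s1**2 + s2**2)        # ~6.25: the variances add

# update: the *product* of the pdfs, renormalized, is again Gaussian
prod = g1 * g2
prod /= np.sum(prod) * dx
mean_p = np.sum(x * prod) * dx
var_p = np.sum((x - mean_p)**2 * prod) * dx
print(mean_p, (m1*s2**2 + m2*s1**2) / (s1**2 + s2**2))   # combined mean
print(var_p, (s1**2 * s2**2) / (s1**2 + s2**2))          # smaller combined variance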
In Chapter 6: Multivariate Kalman Filters, the part
what if x = ẋΔt? (set F[0,0] to 0, the rest at defaults)
leads to the following plot of P = FPFᵀ:
But I can't get the idea behind it; I mean, how is the variance for x (position) reduced to a minimum? On setting F[0,0] = 0, the resulting matrix for P is:
P = FPFᵀ = [[σ_ẋ²·Δt², σ_ẋ²·Δt], [σ_ẋ²·Δt, σ_ẋ²]]
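A small numeric sketch of that computation (the numbers here are example variances I picked, not necessarily the chapter's defaults): with F[0,0] = 0 the predicted position is x = ẋ·Δt, so after P = FPFᵀ the position variance is just σ_ẋ²·Δt², which is far smaller than the original position variance.

import numpy as np

dt = 1.0
F = np.array([[0., dt],
              [0., 1.]])           # F with F[0,0] set to 0
P = np.diag([500., 49.])           # example prior: large position variance, smaller velocity variance

print(F @ P @ F.T)
# [[49. 49.]
#  [49. 49.]]   the position variance drops from 500 to σ_ẋ²·Δt² = 49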
Hi @rlabbe, I came across the section "Stable Computation of the Posterior Covariance" and couldn't understand how the covariance computed with the Joseph equation mitigates a non-optimal K (Kalman gain). I see that being possible only when a different method is used to compute K.
Can you please help?
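(For context, the Joseph form being discussed is P = (I − KH)P(I − KH)ᵀ + KRKᵀ; it gives a valid, symmetric covariance for any gain K, whereas the short form P = (I − KH)P only holds for the optimal Kalman gain. A minimal sketch with made-up numbers, not from the book:)

import numpy as np

# Made-up example values: the Joseph form stays correct for any gain K,
# the short form only matches it when K is the optimal Kalman gain.
P = np.diag([500., 49.])            # prior covariance (example values)
H = np.array([[1., 0.]])            # measurement function
R = np.array([[5.]])                # measurement noise
I = np.eye(2)

S = H @ P @ H.T + R                 # innovation covariance
K_opt = P @ H.T @ np.linalg.inv(S)  # optimal Kalman gain
K_bad = 0.8 * K_opt                 # a deliberately suboptimal gain

def joseph(P, K):
    return (I - K @ H) @ P @ (I - K @ H).T + K @ R @ K.T

def short_form(P, K):
    return (I - K @ H) @ P

print(np.allclose(joseph(P, K_opt), short_form(P, K_opt)))  # True: the forms agree for optimal K
print(joseph(P, K_bad))       # still a valid, symmetric covariance for the suboptimal gain
print(short_form(P, K_bad))   # no longer the correct covariance for that gain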