Kalman Filter textbook using IPython Notebook. This book takes a minimally mathematical approach, focusing on building intuition and experience, not formal proofs. Includes Kalman filters, extended Kalman filters, unscented Kalman filters, and more. Includes exercises with solutions.
@rlabbe I just started working through the Jupyter notebooks. Your intuitive approach to the subject matter is very refreshing. I do have a question/observation about the material in Chapter 3... I find the discussion of the product vs sum of "Gaussians" a bit confusing. It seems that you are discussing the sum of Gaussian random "variables" and the product of Gaussian probability "distributions". The sum of two independent Gaussian random variables is also Gaussian-distributed. The product of two Gaussian random variables is not, in general, Gaussian-distributed.
Having now made it through Chapter 4... I think the source of the confusion is that, in both cases, we are really talking about operations on Gaussian "distributions" rather than random "variables" . The mathematical operation involved in the "prediction" step is really a convolution, rather than a sum, of Gaussian "distributions", which can be shown to be a Gaussian "distribution" with mean and variance as described in Chapter 3. At least, that's what I think after reading Chapter 4... Looking forward to further enlightenment in the upcoming chapters... :-)
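The distinction above can be checked numerically. A minimal sketch (the means and variances are my own placeholder values, not from the book): summing *samples* of two independent Gaussian random variables yields samples whose mean and variance are the sums of the components, while multiplying two Gaussian *densities* and renormalizing yields another Gaussian density, with the familiar fusion formulas used in the update step.

```python
import numpy as np

rng = np.random.default_rng(1)

# Sum of two independent Gaussian random *variables*:
# x + y is Gaussian with mean mu1 + mu2 and variance var1 + var2.
mu1, var1 = 10.0, 4.0
mu2, var2 = 15.0, 9.0
n = 1_000_000
x = rng.normal(mu1, np.sqrt(var1), n)
y = rng.normal(mu2, np.sqrt(var2), n)
s = x + y
print(s.mean(), s.var())  # close to 25.0 and 13.0

# Product of two Gaussian *densities*, renormalized: again a Gaussian
# density, with the standard fusion formulas (this is what the filter's
# update step computes).
mean_prod = (var2 * mu1 + var1 * mu2) / (var1 + var2)
var_prod = (var1 * var2) / (var1 + var2)
print(mean_prod, var_prod)
```

Note that `var_prod` is smaller than either input variance (information is combined), while the variance of the sum grows, which matches the prediction-step behavior described in Chapter 3.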
In Chapter 6: Multivariate Kalman Filters, in the part

what if $x = \dot{x}\Delta t$? (set $\mathbf{F}_{00}$ to 0, the rest at defaults)

leads to the following plot of $\mathbf{P} = \mathbf{F}\mathbf{P}\mathbf{F}^\mathsf{T}$:

But I can't get the idea behind it: how is the variance for $x$ (position) reduced to a minimum? Using $\mathbf{F}_{00} = 0$, the resulting matrix for $\mathbf{P}$ is:
$$\mathbf{F}\mathbf{P}\mathbf{F}^\mathsf{T} = \begin{bmatrix}\sigma^2_{\dot x}\,\Delta t^2 & \sigma^2_{\dot x}\,\Delta t \\ \sigma^2_{\dot x}\,\Delta t & \sigma^2_{\dot x}\end{bmatrix}$$
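A small numeric check of what setting $\mathbf{F}_{00}=0$ does (the variance values below are my own placeholders): with $\mathbf{F}_{00}=0$ the transition becomes $x_{\text{new}} = \dot{x}\,\Delta t$, so the prior position variance $\sigma^2_x$ drops out of $\mathbf{F}\mathbf{P}\mathbf{F}^\mathsf{T}$ entirely and the predicted position variance is only $\sigma^2_{\dot x}\Delta t^2$:

```python
import numpy as np

dt = 1.0
# State is [x, xdot]. With F[0,0] = 0 the model is x_new = xdot*dt,
# i.e. the prior position is discarded entirely.
F = np.array([[0.0, dt],
              [0.0, 1.0]])

# Hypothetical prior covariance: independent position and velocity.
sigma_x2, sigma_v2 = 500.0, 49.0
P = np.diag([sigma_x2, sigma_v2])

P_pred = F @ P @ F.T
print(P_pred)
# With dt = 1: [[49., 49.],
#               [49., 49.]] -- sigma_x2 has vanished from the result.
```

So if $\sigma^2_{\dot x}\Delta t^2 \ll \sigma^2_x$, the plotted position variance collapses, which may be the effect the text is illustrating.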
Hi @rlabbe, I came across the section "Stable Computation of the Posterior Covariance" and couldn't understand how the Joseph-form covariance equation mitigates a non-optimal K (Kalman gain). I would expect that to matter only when a different method is used to compute K. Can you please help?
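One way to see the point numerically, sketched in a scalar case (the prior variance, measurement noise, and gains below are my own placeholders, not from the book): the Joseph form $(I-KH)P(I-KH)^\mathsf{T} + KRK^\mathsf{T}$ gives the true error variance of the estimate for *any* gain $K$, while the short form $(I-KH)P$ agrees with it only at the optimal Kalman gain.

```python
import numpy as np

P, R = 4.0, 1.0              # prior variance and measurement noise
K_opt = P / (P + R)          # optimal Kalman gain (= 0.8 here)
K_bad = 0.5                  # a deliberately suboptimal gain

def joseph(P, R, K):
    # Joseph form: true posterior error variance for *any* gain K.
    return (1 - K)**2 * P + K**2 * R

def short_form(P, R, K):
    # (I - KH)P: only equals the true error variance at K = K_opt.
    return (1 - K) * P

# Monte Carlo check of the actual estimation-error variance with K_bad.
rng = np.random.default_rng(0)
x = rng.normal(0.0, np.sqrt(P), 1_000_000)        # true state, prior mean 0
z = x + rng.normal(0.0, np.sqrt(R), x.size)       # noisy measurement
est = K_bad * z                                   # update from prior mean 0
true_var = np.var(est - x)

print(joseph(P, R, K_bad), short_form(P, R, K_bad), true_var)
# Joseph matches the Monte Carlo variance; the short form does not.
```

At `K_opt` both formulas agree, which may be why the short form looks sufficient elsewhere in the book; the Joseph form is the one that stays correct (and keeps $P$ symmetric positive definite in the matrix case) when $K$ is perturbed by rounding or is otherwise non-optimal.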