Kishan
@kishb87
Does anyone know of smoothing techniques that can be used with an EKF? It looks like all the smoothing techniques in FilterPy assume linear problems. Am I right?
Suresh Sharma
@sursha
Hi @rlabbe! Just getting started with your book, thanks for your work!
alimohebbi
@alimohebbi
hi
I want to run some code about robot motion using an EKF. The code is from the book, but I get this error:
File "/usr/local/lib/python2.7/dist-packages/scipy/_lib/_util.py", line 231, in _asarray_validated
raise ValueError('object arrays are not supported')
ValueError: object arrays are not supported
What should I do?
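That ValueError is SciPy rejecting a NumPy array whose dtype is object. Without seeing the code it's hard to say where it comes from, but one common way such arrays arise in filter code is building a matrix from elements that are themselves arrays rather than plain scalars. A minimal sketch (the variable names are illustrative):

```python
import numpy as np

# One common cause of "object arrays are not supported": building a
# matrix from elements that are themselves arrays instead of scalars,
# which produces a dtype=object array that SciPy refuses to accept.
var = np.array([0.5])                       # accidentally a 1-element array
P_bad = np.array([[var, 0.], [0., var]], dtype=object)
print(P_bad.dtype)                          # object -> SciPy will reject this

# Fix: make every element a plain scalar before building the matrix
v = float(var[0])
P_good = np.array([[v, 0.], [0., v]])
print(P_good.dtype)                         # float64 -> fine
```

Printing `dtype` on the arrays you pass into the filter is a quick way to find the offending one.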
Matthias
@maluethi
@rlabbe: I am reading through your book online and so far it has been amazing, thanks for publishing all this!
Although I haven't read all the chapters yet, I've come to understand that outliers (measurements containing no information) are pretty bad for a 'standard' Kalman filter. Just searching through the PDF version, I didn't find any mention of outliers. So I was wondering if you could point me to any resources that introduce outlier handling within Kalman filtering? Of course I googled, but as a beginner I can't really judge the quality of all the different results I get back...
Roger Labbe
@rlabbe
@maluethi , sorry, I haven't logged in here for a long time.
I intend to add a section on just this topic in a few days. In the meantime, I suggest looking up 'Mahalanobis distance', which is a measure of how far a measurement is from the KF's prior. You can use this to 'gate' your data: discard data that is "too far away". Theory says throw away anything > 3 std, but in practice you may find 4, 5, or even 6 std to be a better gating distance.
If you throw the data away, you just don't call update for that time period. You will thus call predict twice in a row, and your estimate will gain uncertainty because you did two predictions in a row.
That's the general idea. "Kalman filter gating" is also a fruitful search term.
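The gating recipe above can be sketched in plain NumPy. The model below (a 1D constant-velocity filter) and all its noise values and measurements are made up for illustration, not taken from the book:

```python
import numpy as np

# Mahalanobis gating sketch: 1D constant-velocity filter, toy values.
F = np.array([[1., 1.], [0., 1.]])   # state transition (dt = 1)
H = np.array([[1., 0.]])             # measure position only
Q = np.eye(2) * 0.01                 # process noise (toy value)
R = np.array([[1.]])                 # measurement noise (toy value)
GATE = 4.0                           # gate at 4 std, per the advice above

x = np.array([0., 1.])               # state: [position, velocity]
P = np.diag([1., 1.])                # state covariance

for z in [np.array([1.1]), np.array([2.0]), np.array([95.0])]:
    # predict
    x, P = F @ x, F @ P @ F.T + Q
    # gate: Mahalanobis distance of the measurement from the prior
    y = z - H @ x                    # innovation
    S = H @ P @ H.T + R              # innovation covariance
    d = float(np.sqrt(y @ np.linalg.inv(S) @ y))
    if d > GATE:
        print(z, 'gated (outlier), update skipped')
        continue
    # update
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    print(z, 'used')
```

The third measurement (95.0) is wildly inconsistent with the prior, so it gets gated; the filter then simply runs predict again on the next step, and the uncertainty grows, exactly as described above.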
Roger Labbe
@rlabbe
@maluethi the book is updated with a section on outliers; see Chapter 8.
Matthias
@maluethi
@rlabbe Thanks a lot for letting me know. I will certainly look into the new chapter!
noemecd
@noemecd
I wish to congratulate you for your excellent book on Kalman and Bayesian filters.
It is clear, didactic and well-documented.
I had to build my very first Kalman filter in a quite complex configuration (7 state variables, strong non-linearities and a very low signal-to-noise ratio).
Your book has brought me tremendous help in doing that (although, as a Scilab user, I have found the recourse to Python more troublesome than helpful).
It is rare to find such a thorough, simple, user-oriented while scientifically sound presentation of the Kalman filter.
I think you are a born pedagogue.
Lots of thanks.
Miguel Oyarzun
@Miguel-O-Matic
@rlabbe I just started working through the Jupyter notebooks. Your intuitive approach to the subject matter is very refreshing. I do have a question/observation about the material in Chapter 3... I find the discussion of the product vs sum of "Gaussians" a bit confusing. It seems that you are discussing the sum of Gaussian random "variables" and the product of Gaussian probability "distributions". The sum of two independent Gaussian random variables is also Gaussian-distributed. The product of two Gaussian random variables is not, in general, Gaussian-distributed.
Miguel Oyarzun
@Miguel-O-Matic

Having now made it through Chapter 4... I think the source of the confusion is that, in both cases, we are really talking about operations on Gaussian "distributions" rather than random "variables" . The mathematical operation involved in the "prediction" step is really a convolution, rather than a sum, of Gaussian "distributions", which can be shown to be a Gaussian "distribution" with mean and variance as described in Chapter 3. At least, that's what I think after reading Chapter 4... Looking forward to further enlightenment in the upcoming chapters... :-)
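This can be checked numerically: convolving two Gaussian densities (the predict step) yields a Gaussian whose mean and variance are the sums, while the normalized product of two Gaussian densities (the update step) yields the familiar precision-weighted combination. A quick sketch with arbitrary example parameters:

```python
import numpy as np
from scipy.stats import norm

# Two Gaussian densities with arbitrary example parameters
m1, v1 = 1.0, 0.5
m2, v2 = 3.0, 2.0
xs = np.linspace(-20, 20, 20001)     # symmetric grid, so mode='same' aligns
dx = xs[1] - xs[0]
g1 = norm.pdf(xs, m1, np.sqrt(v1))
g2 = norm.pdf(xs, m2, np.sqrt(v2))

# Convolution (predict step): means and variances add
conv = np.convolve(g1, g2, mode='same') * dx
mean_c = np.sum(xs * conv) * dx
var_c = np.sum((xs - mean_c) ** 2 * conv) * dx
print(mean_c, var_c)   # ~ (m1 + m2, v1 + v2) = (4.0, 2.5)

# Normalized product (update step): precision-weighted combination
prod = g1 * g2
prod /= np.sum(prod) * dx
mean_p = np.sum(xs * prod) * dx
var_p = np.sum((xs - mean_p) ** 2 * prod) * dx
print(mean_p, var_p)   # ~ ((m1*v2 + m2*v1)/(v1+v2), v1*v2/(v1+v2)) = (1.4, 0.4)
```

Both results come out Gaussian with exactly the Chapter 3 formulas, which supports reading the predict step as a convolution of distributions rather than a sum of variables.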

han-so1omon
@han-so1omon
Thank you for writing this book. I hope it is valuable to students, as I wish that I had it when I was a student.
Douglas Daly
@douglas-daly_gitlab
Wow - what an amazing book! You offer a very clear and conceptual approach to these topics without glossing over the foundations. And of course, presenting as Jupyter notebooks is a huge help and removes the mysteries of tuning parameters and such.
gabrielegranello
@gabrielegranello
Thank you very much for providing the library and the book, both have been of immense value for me to understand how Kalman filters work.
dangle1
@dangle1
Is there anyone here interested in forming a study group for this book? I'm working my way through it and understand it for the most part, but think studying together would make learning better.
Rachel Cohen Yeshurun
@rachelyeshurun
Wow, just amazing notebooks; I hope one day to work through all of this. In the meantime, this notebook series is my inspiration for my own work. Question for you @rlabbe, or anyone who knows: I haven't been able to find out how to show all output on startup. All your notebooks, including when rendered on Binder, show up with all the plots rendered and no cell output numbers. It looks so clean; how do you do that??? My cells just show the code. Markdown is rendered, but code output is not shown, even though I ran all cells and saved before committing.
MichaelHay42
@MichaelHay42
Hi @rlabbe, just starting to read your book (in Chapter 1) and it is FANTASTIC. I am learning so much; it is a delight! About filters, yes, but I also didn't know I could run Jupyter notebooks on Azure and Binder. Very cool! THANK YOU so much!!
laurelstrelzoff
@laurelstrelzoff
Hi @rlabbe, I'm trying to implement a particle filter (SIR) like the one in Chapter 12, but I'm having difficulty with it. I'm using it for bearings-only tracking of multiple targets, to differentiate between true and false targets, and I haven't been able to adapt that code to work without landmarks. Do you have any advice?
fanggenzaiXHBS
@fanggenzaiXHBS
Hi @rlabbe, you say that the difference between the measurement and the prediction is called the residual, but I don't quite agree. The difference between the measurement and the prediction is called the innovation; the difference between the measurement and the posterior is called the residual.
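The distinction can be written out with toy numbers (the gain and all values below are made up for illustration, not computed from a real filter):

```python
import numpy as np

# Toy 1D example distinguishing innovation from (post-fit) residual.
H = np.array([[1.0]])
x_prior = np.array([2.0])          # predicted state
z = np.array([2.5])                # measurement

innovation = z - H @ x_prior       # pre-fit: measurement minus prediction
K = np.array([[0.5]])              # Kalman gain (toy value)
x_post = x_prior + K @ innovation  # posterior state

residual = z - H @ x_post          # post-fit: measurement minus posterior
print(innovation, residual)        # [0.5] [0.25]
```

The innovation is what the filter weights by the Kalman gain during update; the post-fit residual measures how far the updated estimate still is from the measurement.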