I was stuck on this implementation for months. I kept tweaking the code but could not get it to work until Dieter Fox
visited our department. He helped me tune the Kalman filters, and presto, everything worked. It turns out that the
observation models should have a very large error. This allows observations to match landmarks that are relatively
far away. Since all of the particles match at some level, you get a nice distribution to sample from, and the particle
filter does not easily converge to a badly incorrect answer. But from a sensor standpoint you are saying, "I see a tree five feet away, but I think it could be as far away as 50 feet." I don't think I would ever have tried those values without his help. So, thank you again, Dieter, if you happen to read this.
The demo below is from a simulated run. I have also run this on the Victoria Park data set. That data set is large, and
I do not want to put it on my bandwidth-limited site. If you want the Victoria Park data formatted for this program, please email me.
In the applet below, blue circles are landmarks that the robot is currently seeing. Orange circles are landmarks that the robot
has placed in its map. The display jitters a lot because it only shows the best particle at any given moment. Different paths
take turns being the best particle, so what you are seeing is a succession of different best theories about where the robot is and where it has been.
You can pause the program and cycle through all of the current views if you wish.
This applet requires Java 1.5. I haven't tested this applet code on all systems. Please let me know if you cannot get it working by messaging alan at this domain.

Source Code
This project requires Oursland Utility Classes.