jlblancoc on further-3rdparty-refactor
Fixes for msvc (compare)
jlblancoc on refactor-3d-nanogui
WIP: integrate nanogui as depen… (compare)
jlblancoc on refactor-3d-nanogui
Add nanogui as external project WIP: integrate nanogui as depen… WIP move to nanogui-sdl (compare)
jlblancoc on further-3rdparty-refactor
GLUT moved to 3rdparty directory (compare)
bergercookie
https://en.wikipedia.org/wiki/Holonomic_(robotics)
Anyway, there are navigation algorithms in MRPT that can help you navigate from A -> B, given some sensor input. I guess the most popular one would be the reactive navigator: https://www.mrpt.org/tag/reactive-navigation/
bergercookie
If you're using ROS, MRPT also offers ROS wrappers: https://github.com/mrpt-ros-pkg/mrpt_navigation
bergercookie
As far as I know though, both the reactive nav2d and the localisation nodes operate on 2D laser scans
bergercookie
Is it a set of given (x,y) coords?
bergercookie
OK, but how do you obtain that (x,y) representation to feed to your algorithm?
If you have that (x,y) representation and your estimated position from your GPS-like system, then you can implement simple PID controllers on the x, y (and theta?) of your robot to follow those waypoints
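In case it helps, here is a minimal sketch of that idea in plain C++ (not MRPT or ROS API; the gains, tolerances and the unicycle-model simulation are just placeholders): a P-controller on distance and heading error that drives through a list of (x, y) waypoints, given a pose estimate from your GPS-like localisation source.

```cpp
// Minimal waypoint-following sketch (hypothetical names, placeholder gains).
#include <cmath>
#include <cstdio>
#include <utility>
#include <vector>

struct Pose { double x, y, theta; };
struct Cmd  { double v, w; };  // linear and angular velocity commands

// P-controller on distance and heading error (only the P term of a PID).
Cmd followWaypoint(const Pose& p, double wx, double wy)
{
    const double kv = 0.5, kw = 1.5;                  // placeholder gains
    const double dx = wx - p.x, dy = wy - p.y;
    const double dist    = std::hypot(dx, dy);
    const double heading = std::atan2(dy, dx);
    double err = heading - p.theta;                   // wrap to [-pi, pi]
    err = std::atan2(std::sin(err), std::cos(err));
    return {kv * dist, kw * err};
}

int main()
{
    std::vector<std::pair<double, double>> waypoints = {{1, 0}, {1, 1}, {0, 1}};
    Pose pose{0, 0, 0};
    size_t i = 0;
    const double dt = 0.05, reachedTol = 0.05;
    while (i < waypoints.size())
    {
        const auto [wx, wy] = waypoints[i];
        if (std::hypot(wx - pose.x, wy - pose.y) < reachedTol) { ++i; continue; }
        const Cmd c = followWaypoint(pose, wx, wy);
        // Integrate a unicycle model here in place of a real robot:
        pose.x     += c.v * std::cos(pose.theta) * dt;
        pose.y     += c.v * std::sin(pose.theta) * dt;
        pose.theta += c.w * dt;
    }
    std::printf("Reached all waypoints, final pose (%.2f, %.2f)\n", pose.x, pose.y);
    return 0;
}
```

On a real robot you would replace the simulated integration with your velocity command interface and feed in the localisation estimate each cycle.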
bergercookie
Otherwise you have to "sense" where the line is relative to the robot and have some sort of feedback control loop to follow that line.
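And a similarly rough sketch of that line-following case: the sensing is simulated here (the sensedOffset variable stands in for whatever your camera / IR line sensor reports), and a proportional feedback law steers the robot back onto the line. Gains and speeds are placeholders, not tuned values.

```cpp
// Minimal "sense the line, close the loop" sketch (simulated sensing,
// placeholder gains). The robot drives along a straight line on the x axis.
#include <cmath>
#include <cstdio>

int main()
{
    double lateralOffset = 0.2;   // m, robot starts 20 cm off the line
    double heading       = 0.0;   // rad, relative to the line direction
    const double v = 0.3;         // m/s forward speed (placeholder)
    const double kOffset = 2.0, kHeading = 1.5;  // placeholder gains
    const double dt = 0.05;

    for (int step = 0; step < 200; ++step)
    {
        // "Sense" where the line is relative to the robot:
        const double sensedOffset = lateralOffset;  // stand-in for a real sensor
        // Feedback law: steer to reduce both offset and heading error.
        const double w = -kOffset * sensedOffset - kHeading * heading;
        // Simulated unicycle kinematics:
        lateralOffset += v * std::sin(heading) * dt;
        heading       += w * dt;
    }
    std::printf("Final offset from the line: %.3f m\n", lateralOffset);
    return 0;
}
```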