I've started working on a new EKF feature to allow switching between the external nav (i.e. T265 or similar) and GPS for the position. It's very much still a work-in-progress but the code is here if anyone is interested: https://github.com/rmackay9/rmackay9-ardupilot/commits/ekf-pos-source1. At the moment the position source can be manually changed using an auxiliary switch but eventually I hope to make it automatic based on some heuristics of how reliable the two sources are.
I've discussed this with Tridge and PaulR already and have their agreement that the approach is OK. There are some open discussions including:
what is the minimum feature set we need before it can be merged? i.e. is switching between external and GPS enough or do we want a bigger change?
Tridge suggests we should move the sensor buffers into this new class.
Paul wants the altitude (aka Z-axis) moved into the new class to allow more clearly defined behaviour when switching between the sources.
@rmackay9 is this work complementary to the GSoC project with @tridge ?
@patrickpoirier51, Txs for that, I had not considered this. I've now discussed with Tridge and we think they are complementary and shouldn't conflict too badly. So the GSoC sensor affinity project allows the 1st and 2nd GPS (for example) to be used by different EKF lanes, while my new class would allow specifying whether to use GPS or External Nav for both lanes. Discussing this makes it clear that it should also be possible to have one EKF lane using GPS and the other using External Nav... this is where the two projects meet perhaps.
@rmackay9 that’s a very promising feature. We’ve got a ROS-controlled mower that currently only uses ArduPilot for actuating servos. Being able to fall back on ArduRover's GPS pose/position and use the geofence would make for great failsafes.
@rmackay9 Yes I am sending yaw values as well as position. I also send linear velocity when providing LOCAL coordinate setpoints (I know this is somewhat outside the scope of my question but I was hoping to do the same in BODY coordinates once the current issue is fixed).
While I complete some overdue maintenance on my rover (before I upload my logs), I did notice a message "POSE JUMP DETECTED". Could this be a reason for my home position shift?
Talking about EKF, there's a new ROS package called fuse, the successor of robot_localization. A must-have for those interested in special motion models, e.g. Ackermann. https://github.com/locusrobotics/fuse
Hi guys, I updated my Jetson with the latest NVIDIA image (Ubuntu 18.04.4) and both cameras (D435 and T265) are resetting the USB. Anyone have a clue what could be happening?
@stam9_gitlab, ok, I think the issue may be in the yaw targets that are being sent. I cannot see what values are being sent in the logs but maybe you could check those. Make sure that the values are in radians... maybe just don't send them at all and see if it works better.
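In case it helps, here's a minimal sketch of the degrees-to-radians sanity check I mean (the function name and the wrap-to-[-pi, pi) convention are my own; adapt to however you pack your setpoint messages):

```python
import math

def yaw_deg_to_rad(yaw_deg):
    """Convert a yaw setpoint from degrees to radians, wrapped to [-pi, pi)."""
    rad = math.radians(yaw_deg)
    return (rad + math.pi) % (2.0 * math.pi) - math.pi

# e.g. a 270-degree heading becomes -pi/2 radians before being sent
print(yaw_deg_to_rad(270.0))
```

If the values going out are in the hundreds, they are almost certainly still in degrees.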
@RoboBil, yes, the "pose jump" message means that Thien's scripts noticed that the position was moving impossibly quickly. This points towards a camera issue and is probably not an ArduPilot or script issue.
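For anyone curious what such a check might look like, here's a rough sketch (my own reconstruction, not Thien's actual code): flag a jump whenever the speed implied by two consecutive pose samples exceeds a plausibility threshold.

```python
def detect_pose_jump(prev_pos, curr_pos, dt, max_speed=5.0):
    """Return True if moving from prev_pos to curr_pos (3-tuples, metres)
    within dt seconds implies a speed above max_speed (m/s).
    The 5 m/s default is an arbitrary illustration value."""
    if dt <= 0:
        return False
    dist = sum((c - p) ** 2 for c, p in zip(curr_pos, prev_pos)) ** 0.5
    return dist / dt > max_speed

# a 3 m displacement in 0.1 s implies 30 m/s, so it is flagged as a jump
print(detect_pose_jump((0, 0, 0), (3, 0, 0), 0.1))
```

When a jump like that is detected, the position data is untrustworthy, which is why it points at the camera rather than at ArduPilot.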
hi Patrick, 2 hours running without problems. The battery dropped to 10.7 volts and then I had the power reset, so I think compiling librealsense with -DBUILD_WITH_TM2=true on the Nano or other Jetson boards can help.
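For reference, the build sequence would look roughly like this (BUILD_WITH_TM2 is the librealsense CMake option for T265/TM2 support; the other flags and the -j4 job count are just my assumptions, so adjust for your board):

```shell
# build librealsense from source with T265 (TM2) support enabled
git clone https://github.com/IntelRealSense/librealsense.git
cd librealsense
mkdir build && cd build
cmake .. -DBUILD_WITH_TM2=true -DCMAKE_BUILD_TYPE=Release
make -j4
sudo make install
```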
I'm testing with a rover. I was doing some other things today but I will test tomorrow and get back to you.
Sorry Vin I really don't get it
"...Assuming you received from Intel an engineering sample of T265(TM2) device, enable support for it in the SDK.."
Does TM2 refer to a very particular case?
Oh... in installation step 5, TM1-specific:
Tracking Module requires hid_sensor_custom kernel module to operate properly. Due to TM1's power-up sequence constraints, this driver is required to be loaded during boot for the HW to be properly initialized.
Interesting. I saw in a lot of places not to patch the kernel to use the RealSense cameras; on the RealSense page about the Nano they just tell you to install using pre-compiled packages.
I'm now running for more than an hour and a half without a USB reset.
@Huibean, that's very interesting. We are hoping to work with ANU (Australian National University), which is also working on a visual odometry system (open source of course), and Tridge suggested it would be good to be able to get it working in parallel with the T265's built-in algorithm so we can compare performance. I'll pass a link to ANU.
I think using a stable release instead of some random version (aka master) is probably a good idea but I'll have to remember from now on to update this version as Intel releases improvements.
I was also thinking that it might be good if the scripts were enhanced to show the firmware version being used. I guess for the T265 it's somehow updated automatically so the librealsense SDK version always matches the camera. This may be different for other cameras. Anyway, not urgent at all, just a thought...
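On showing the firmware version: something along these lines with pyrealsense2 might do it. This is an untested sketch on my part (it needs pyrealsense2 installed and a camera attached to actually report anything, and degrades to an empty list otherwise):

```python
try:
    import pyrealsense2 as rs
except ImportError:  # SDK not installed; degrade gracefully
    rs = None

def device_firmware_versions():
    """Return a list of (device name, firmware version) for connected devices."""
    if rs is None:
        return []
    ctx = rs.context()
    return [(dev.get_info(rs.camera_info.name),
             dev.get_info(rs.camera_info.firmware_version))
            for dev in ctx.query_devices()]

for name, fw in device_firmware_versions():
    print(name, fw)
```

The scripts could print this at startup next to the SDK version so logs show both.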
@rmackay9 out of curiosity who or which team at ANU are you working with? Rob Mahony?
The T265 camera works with the Raspberry Pi 4. When I connect the Raspberry camera v2.1 and display the HDMI preview, the T265 camera disconnects after a minute or two. Without the RPi camera v2.1 connected, everything works fine. What could be the reason?
Hi Randy, I think this version of librealsense is precompiled, right? After the weekend of testing I really don't have the USB issue on the Jetson any more after compiling with TM2 support. As I'm using it on a rover, I ran it for at least 8 hours (with time to swap and charge batteries) and had no USB issue, except when the battery goes below 10.8 volts. One thing I'm trying to understand: sometimes after booting the Jetson the camera is not detected by the kernel (nothing in dmesg). Sometimes this is not an issue and starting t265_to_mavlink works fine; other times I had to shut down the Jetson, remove the cable, boot, and only then plug in the camera. This was happening more when I was using the cable provided by Intel; after swapping to a Western Digital cable I've only had this issue once so far.
I saw someone testing with a rover. I can't get Guided mode working on my rover; is there something I can read or study about using this with Rover? It's my first try with a rover.
@thien94 Thanks for your advice. Yes, you're right. Thanks to Randy's approach, librealsense-2.35.2 on a Raspberry Pi 4 4GB is working just fine, including when started during boot.
Hello, I have tested on an RPi4 with Randy's APSync. Do you have an APSync image with the ROS packages installed so I can see RViz? I can't run `sudo apt-get install ros-$ROS_VER-realsense2-camera`.
@juvinski, @RoboBil is also testing with Rover. I have not specifically tested Rover but it should work. You've set up the T265 and ArduPilot according to the wiki? https://ardupilot.org/rover/docs/common-vio-tracking-camera.html. Some key points: it's best to use Rover-4.1.0-dev and enable the EKF3 (explained on the wiki page). You'll need to set the EKF origin, which can be done easily using the Mission Planner (also explained on the wiki page). It is easiest if you use an RPi4 with the standard APSync image to remove the chance of issues with the companion computer, but if that's not possible, it's best to provide an onboard log (from the autopilot) so we can see if the companion computer is correctly sending the required messages (we will look at the VISP and VISV messages).
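For what it's worth, the parameter set I'd expect for the external-nav setup looks roughly like this. These names and values are from my memory of that wiki page, so treat them as assumptions and defer to the wiki for the authoritative list:

```
AHRS_EKF_TYPE = 3   # use EKF3
EK2_ENABLE    = 0   # disable EKF2
EK3_ENABLE    = 1   # enable EKF3
GPS_TYPE      = 0   # no GPS; position comes from external nav
VISO_TYPE     = 1   # accept visual odometry over MAVLink
```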
Hi @rmackay9, yes, I followed the wiki and enabled EKF3. I was testing the D435 on an RVR and I'm fighting with problems of depth measurement on objects on the ground. Testing today I saw variations and jumps in the messages even with the right confidence level, so I opened an issue on Intel's RealSense GitHub. I will do more tests tomorrow and post the logs for you. Because I was checking the stability of the T265, the log is huge, so I will do a small test. I could run missions without a problem (apart from the jump message), so I will test this and send the log :)
Hi. I'm still very new to ROS and Pixhawk. I'm currently working on my drone project. I have been given a task to send ROS data to a Pixhawk flight controller. I'm able to run the 2D drone navigation code in the simulated environment, where I achieve mapping, localization and path planning. However, now I want to send the odometry data (ground_truth/state) to my Pixhawk flight controller. When I do `rostopic echo /ground_truth/state` I can see the pose, position, covariance and twist data, and I need to send this data to my flight controller. Please, if anyone can suggest anything, it'd be really helpful for me. Thank you :)
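A possible starting point: convert each nav_msgs/Odometry pose into the (x, y, z, roll, pitch, yaw) tuple that a MAVLink VISION_POSITION_ESTIMATE message expects, then send it with pymavlink or publish it to mavros. The sketch below only shows the pure conversion; the ENU-to-NED handling and the helper names are my assumptions, so verify the frame conventions against your setup:

```python
import math

def quat_to_euler(x, y, z, w):
    """Quaternion -> (roll, pitch, yaw) in radians (ZYX convention)."""
    roll = math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
    pitch = math.asin(max(-1.0, min(1.0, 2.0 * (w * y - z * x))))
    yaw = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    return roll, pitch, yaw

def odom_to_vision_position_estimate(px, py, pz, qx, qy, qz, qw):
    """Map an ENU odometry pose to the NED fields of VISION_POSITION_ESTIMATE.
    Note: a full attitude conversion also adjusts roll and pitch; for
    simplicity only the yaw rotation between frames is handled here."""
    roll, pitch, yaw = quat_to_euler(qx, qy, qz, qw)
    # ENU -> NED position: north = ENU y, east = ENU x, down = -ENU z
    return (py, px, -pz, roll, pitch, (math.pi / 2.0) - yaw)
```

If you're already running mavros, a simpler route is to republish the pose portion of /ground_truth/state as a geometry_msgs/PoseStamped on mavros's vision pose topic and let mavros handle the MAVLink packing and frame conversion for you.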