    Randy Mackay
    @rmackay9
    @thien94, @patrickpoirier51 (and anyone else who is interested), I realised that our APSync scripts are hardcoded to grab librealsense 2.33.1. https://github.com/ArduPilot/companion/blob/master/Common/Ubuntu/librealsense/install_librealsense.sh#L16
    I think using a stable release instead of some random version (aka master) is probably a good idea but I'll have to remember from now on to update this version as Intel releases improvements.
    I was also thinking that it might be good if the scripts were enhanced to show the firmware version being used. I guess for the T265 it's somehow updated automatically so the librealsense SDK version always matches the camera. This may be different for other cameras. Anyway, not urgent at all, just a thought.
    Ed Muthiah
    @ed-muthiah
    @rmackay9 out of curiosity who or which team at ANU are you working with? Rob Mahony?
    Randy Mackay
    @rmackay9
    @ed-muthiah, yes, it's Rob Mahony. He did a presentation at our developer unconference this year. https://www.youtube.com/watch?v=nO_y6BRBBOg
    Ed Muthiah
    @ed-muthiah
    @rmackay9 awesome thanks for sharing :) I'm starting post grad study at the ANU and he's the robotics lecturer!
    Randy Mackay
    @rmackay9
    @ed-muthiah, cool!
    apinxiko
    @apinxiko
    Hi, I found this: IntelRealSense/librealsense#6712 Has anyone run into the same problem?
    paulzpz
    @paulzpz
    The T265 camera works with a Raspberry Pi 4. When I connect the Raspberry Pi camera v2.1 and display the HDMI preview, after a minute or two the T265 camera disconnects. Without the RPi camera v2.1 connected, everything works fine.
    What could be the reason?
    Thien Nguyen
    @thien94
    @paulzpz Would it work if you connect the RPI camera without the display?
    paulzpz
    @paulzpz
    When the preview of the RPi camera is turned on, after a while the T265 camera disconnects. It does not depend on whether or not an HDMI device is connected.
    Thien Nguyen
    @thien94
    You can run realsense-viewer to see all messages from librealsense and inspect the problem. If it's something specific, it would be better to create an issue on their repo https://github.com/IntelRealSense/librealsense
    Vinicius Juvinski
    @juvinski
    Hi Randy, I think this version of librealsense is precompiled, right? After the weekend of testing, I no longer have the USB issue on the Jetson after compiling with TM2. Since I'm using it on a rover, I ran it for at least 8 hours (with breaks to swap and charge batteries) and had no USB issues, except when the battery goes below 10.8 volts. One thing I'm trying to understand: after booting the Jetson, the camera is sometimes not detected by the kernel (nothing in dmesg). Sometimes this is not an issue and starting t265_to_mavlink works fine; other times I had to shut down the Jetson, remove the cable, boot, and only then plug in the camera. This happened more when I was using the cable provided by Intel; after swapping to a Western Digital cable I've had this issue only once so far.
    I saw someone testing with Rover. I can't get Guided mode working on my rover; is there something I can read or study for using this with Rover? It's my first try with Rover.
    apinxiko
    @apinxiko
    @thien94 Thanks for your advice. Yes, you're right. Thanks to Randy's approach, librealsense-2.35.2 with the Raspberry Pi 4 4GB also works just fine while booting.
    Jee88
    @Jee88
    Hello
    I have tested on an RPi4 with Randy's APSync.
    Do you have an APSync image with the ROS package installed, to see rviz?
    I can't run `sudo apt-get install ros-$ROS_VER-realsense2-camera`.
    Randy Mackay
    @rmackay9
    @juvinski, @RoboBil is also testing with Rover. I have not specifically tested Rover but it should work. You've set up the T265 and ArduPilot according to the wiki? https://ardupilot.org/rover/docs/common-vio-tracking-camera.html. Some key points: it's best to use Rover-4.1.0-dev and enable the EKF3 (explained on this wiki page). You'll need to set the EKF origin, which can be done easily using the Mission Planner (also explained on the wiki page). It is easiest if you use an RPI4 with the standard APSync image to remove the chance of issues with the companion computer, but if that's not possible, it's best to provide an onboard log (from the autopilot) so we can see if the companion computer is correctly sending the required messages (we will look at the VISP and VISV messages).
    Vinicius Juvinski
    @juvinski
    Hi @rmackay9, yes, I followed the wiki and enabled EKF3. I was testing the D435 on an RvR and I'm fighting problems with depth measurement of objects on the ground. I was testing today and saw variations and jumps in the messages even with the right confidence level, so I opened an issue on Intel's realsense git. I will do more tests tomorrow and post the logs for you. Since I was checking the stability of the T265, the log is huge, so I will do a small test. I could run missions without a problem, apart from the jump messages, so I will test this and send the log :)
    gauravshukla914
    @gauravshukla914
    Hi. I'm still very new to ROS and Pixhawk. I'm currently working on my drone project and have been given the task of sending ROS data to the Pixhawk flight controller. I'm able to run the 2D drone navigation code in the simulated environment, where I'm able to achieve mapping, localization and path planning.
    However, now I want to send the odometry data (ground_truth/state) to my Pixhawk flight controller. When I do rostopic echo /ground_truth/state I can see the pose, position, covariance and twist data, and I need to send this data to my flight controller. If anyone can suggest something, it'd be really helpful for me. Thank you :)
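For context, forwarding a ROS-style pose to ArduPilot usually means converting from ENU to the NED frame and packing the fields of a VISION_POSITION_ESTIMATE message. A minimal sketch of that conversion, assuming pymavlink-style sending (the helper names here are illustrative, not from any ArduPilot script):

```python
import time

def enu_to_ned(x_enu, y_enu, z_enu):
    # ROS topics such as /ground_truth/state use ENU (x east, y north, z up);
    # ArduPilot's EKF expects NED (x north, y east, z down).
    return y_enu, x_enu, -z_enu

def vision_position_fields(pose_enu, rpy, t_boot):
    """Build the field tuple for a VISION_POSITION_ESTIMATE message."""
    n, e, d = enu_to_ned(*pose_enu)
    usec = int((time.time() - t_boot) * 1e6)  # timestamp since boot, microseconds
    roll, pitch, yaw = rpy
    return (usec, n, e, d, roll, pitch, yaw)
    # with pymavlink: conn.mav.vision_position_estimate_send(*fields)
```

The actual send would use a pymavlink connection (as mavros or t265_to_mavlink do internally); the sketch only shows the frame handling that trips people up most often.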
    patrickpoirier51
    @patrickpoirier51
    @gauravshukla914 I suggest you try on this channel https://gitter.im/mavlink/mavros
    Randy Mackay
    @rmackay9
    @gauravshukla914, I did some random googling and found that "ground_truth" is not a ROS topic, it's a Gazebo topic, and you may need to use gazebo_ros. I don't know much really; I've never heard of this topic before.
    Randy Mackay
    @rmackay9
    Congratulations @thien94 (and @patrickpoirier51) on your progress with the Intel Depth Camera for object avoidance!! https://discuss.ardupilot.org/t/gsoc-2020-integration-of-ardupilot-and-realsense-d4xx-depth-camera-for-simple-obstacle-avoidance/58474
    Thien Nguyen
    @thien94
    Thank you @rmackay9 for your compliments! Also many thanks to @patrickpoirier51 for your guidance so far!
    patrickpoirier51
    @patrickpoirier51
    😊
    Randy Mackay
    @rmackay9
    @patrickpoirier51, I'm working on creating another APSync image with the updated librealsense (2.35.2 instead of 2.33.1). While I'm updating it I would also like to make it easier for users to switch between 2.4GHz and 5GHz. One of the other testers said everything was fine for him on 2.4GHz using the older APSync image. It looks like it's just the wifi.channel and wifi.operation-mode that I need to update. For 5GHz it's "48" and "a" respectively. Do you know what these should be for 2.4GHz?
    This was the commit I added to switch the APSync wifi accesspoint to be only 5Ghz: ArduPilot/companion@11c45d4
    patrickpoirier51
    @patrickpoirier51
    @rmackay9 yes, these are the parameters

    Possible values:

    a: IEEE 802.11a (5 GHz)
    b: IEEE 802.11b (2.4 GHz)
    g: IEEE 802.11g (2.4 GHz)
    ad: IEEE 802.11ad (60 GHz)
    Default value: g

    As for frequencies, looking at the wiki
    there are numerous choices. The WiFi AP defaults to channel 6.
    I chose 48 based on a local scan in my area.
    (image: local wifi scan)
    But if you look at the wiki table, channel 48 should be indoor-only in many countries ;-)
    Randy Mackay
    @rmackay9
    OK thanks. In the APSync image I'm creating now I'll leave the channel as the default, "6"
    Randy Mackay
    @rmackay9
    .. well, actually that didn't seem to work well so I've put it back to channel 48
    Ed Muthiah
    @ed-muthiah
    Has anyone tried out the Up Board Xtreme i7 with the Intel Myriad VPU? Wondering how it would compare to an i7 NUC. Both seem like good options for the Realsense depth+tracking camera combo
    Randy Mackay
    @rmackay9

    I’ve released a new APSync image for the RPI4 + T265 which can be downloaded from https://firmware.ardupilot.org/Companion/apsync/beta/apsync-rpi-ubuntu-t265-20200703.img.xz

    Improvements over the previous beta:

    1. Intel Realsense libraries upgraded to v2.35.2 (was v2.33.1)
    2. Thien’s scripts updated to the latest version including these improvements:
        a. status messages including “T265: connecting to camera” inform user of connection problems
        b. position reset information passed from the camera to AP’s EKF
        c. vision-speed-estimate and vision-position-delta message support
        d. user sets EKF origin from the GCS map instead of using a hard-coded location
    3. WiFi AP scripts added to allow easier enabling, disabling and switching of the band (2.4GHz vs 5GHz). Log in to the RPI (login: apsync, pwd: apsync) and look in the start_wifi directory. Run commands like “sudo ./start_wifi_2.4Ghz.sh”

    All testing is greatly appreciated!

    Randy Mackay
    @rmackay9
    I've gone ahead and (with Peter Barker's help) made this new apsync image the standard. i.e. the one that is linked from the T265 setup wiki page https://ardupilot.org/copter/docs/common-vio-tracking-camera.html
    patrickpoirier51
    @patrickpoirier51
    @rmackay9 thanks for all this
    These days you are doing T265 and I am into Avoidance... have we switched places? ;-)
    Brock DeWerff
    @dewerff23_twitter
    Hello all! I am an aerospace engineering intern at Northrop Grumman this summer, and for my intern project I was tasked with creating a quadcopter that could complete a couple of different functions in a GPS-denied environment. To do this my team and I decided on a setup that involves a Raspberry Pi 4, Pixhawk Cube, and 3 TFmini-S time-of-flight sensors. On the software side I personally adapted the t265_to_mavlink script to work in tandem with the time-of-flight sensors to send the VISION_POSITION_ESTIMATE message. Using Mission Planner I have confirmed that the Cube is receiving these messages and that the global home is being set. However, I have noticed that after successive runs of my script the quadcopter’s local position will sometimes be more than 200 meters away from the 0,0,0 point set. When this happens the primary EKF gets set to the backup of 0 or 1 rather than the 2 which I set in the parameters. I’m assuming this is also what is preventing us from switching into Auto or Guided mode after the quad arms in Stabilize. If anyone could help point me in the direction of a solution to this problem, that would be greatly appreciated! Thank you for your time!
    lightpathproject
    @lightpathproject
    @rmackay9 unable to arm with this new image. “PreArm Logging Failed”
    I am able to see vision position estimate messages through mavlink inspector.
    Randy Mackay
    @rmackay9
    @lightpathproject, there may be a problem with the SD card on the autopilot. It may help to format the card or better yet just replace it.
    @dewerff23_twitter, welcome. It's probably best to provide an onboard log, but I guess there's a problem with the vision-position-estimate message being sent in. Perhaps its update rate is too low, or perhaps it sometimes gives unreasonable values and the EKF is deciding to ignore it. I've also seen the EKF lane switch when there is bad position information provided; we haven't gotten to the bottom of why this happens, but in any case, all lanes will be receiving the vision-position-estimate message, so there shouldn't be much difference between the position estimates they come up with.
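One way to catch the "unreasonable values" case described above is to sanity-check positions before sending them to the autopilot. A minimal sketch (the helper name and the 5 m/s threshold are illustrative, not part of the actual t265_to_mavlink script):

```python
# Reject implausible jumps in the position fed to VISION_POSITION_ESTIMATE.
def is_plausible(prev_xyz, new_xyz, dt, max_speed=5.0):
    """Return False when the implied speed between two samples is impossible
    for the vehicle, suggesting a sensor glitch rather than real motion."""
    if prev_xyz is None or dt <= 0:
        return True  # nothing to compare against yet
    dist = sum((a - b) ** 2 for a, b in zip(new_xyz, prev_xyz)) ** 0.5
    return dist / dt <= max_speed
```

Skipping (or logging) samples that fail this check makes rate problems and value problems easy to tell apart in the onboard log.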
    Brock DeWerff
    @dewerff23_twitter
    I’ll have to ask around to see if it is acceptable to release the EKF logs. When I use the MAVLink inspector it shows the VPE message being sent at around 27-31 Hz. I will say that it was advised to make the takeoff and landing milestone a one-dimensional problem, so only the down-facing sensor is inputting data; the other two are just returning 0’s. Could that be considered unreasonable? Also, this issue seems to be resolved when I reboot the Pixhawk Cube, but it still says variance is preventing the Cube from entering Auto mode.
    Randy Mackay
    @rmackay9
    @dewerff23_twitter, the EKF won't have a good horizontal position estimate if only altitude is provided in the vision-position-estimate messages, and all the autonomous modes require a good position estimate to work. Auto mode isn't happy only controlling one axis; there are various checks to ensure it has a good 3D position estimate.
    By the way, the EKF can be directly setup to use a range finder for its altitude estimate during takeoff and landing. The way to do this is to set the EKx_RNG_USE_HGT and EKx_RNG_USE_SPD parameters. In general I don't recommend this and this probably isn't helpful if your longer term goal is to provide a 3D position estimate based on lidar.
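As a rough sketch of what those parameters gate (simplified from the parameter descriptions; the function and values below are illustrative, not ArduPilot source code):

```python
# Simplified view of the EKx_RNG_USE_HGT / EKx_RNG_USE_SPD gating: the EKF uses
# the rangefinder for its height estimate only when flying low (below a
# percentage of the rangefinder's max range) and slow (below a speed threshold).
def use_rangefinder_for_height(rng_m, rng_max_m, gnd_speed_ms,
                               rng_use_hgt_pct, rng_use_spd_ms):
    if rng_use_hgt_pct <= 0:  # -1 or 0 disables rangefinder height
        return False
    below_hgt = rng_m < rng_max_m * rng_use_hgt_pct / 100.0
    below_spd = gnd_speed_ms < rng_use_spd_ms
    return below_hgt and below_spd
```

This is why it helps mainly for takeoff and landing: once the vehicle climbs or speeds up, the EKF falls back to its normal height source.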