    Vinicius Juvinski
    @juvinski
    Hi Randy, I think this version of librealsense is precompiled, right? After the weekend of testing, I no longer have the USB issue on the Jetson after compiling with the TM2 support. Since I'm using it on a rover, I ran it for at least 8 hours - with time to swap and charge batteries - and had no USB issues, except when the battery goes below 10.8 volts. One thing I'm trying to understand: sometimes after booting the Jetson the camera is not detected by the kernel - nothing in dmesg. Sometimes this is not an issue and starting t265_to_mavlink works fine, but other times I had to shut down the Jetson, remove the cable, boot, and only then plug in the camera. This was happening more when I was using the cable provided by Intel; after swapping to a Western Digital cable I have only had this issue once so far.
    I saw someone testing with a rover. I can't get guided mode working on my rover - is there something I can read or study about using this with a rover? It's my first try with a rover.
    apinxiko
    @apinxiko
    @thien94 Thanks for your advice. Yes, you're right. Thanks to Randy's approach, librealsense-2.35.2 with the Raspberry Pi 4 4GB is working just fine, even during boot.
    Jee88
    @Jee88
    Hello
    I have tested on an RPi4 with Randy's APSync.
    Do you have an apsync image with ROS package installed to see rviz?
    I can't run `sudo apt-get install ros-$ROS_VER-realsense2-camera`.
    Randy Mackay
    @rmackay9
    @juvinski, @RoboBil is also testing with Rover. I have not specifically tested Rover but it should work. Have you set up the T265 and ArduPilot according to the wiki? https://ardupilot.org/rover/docs/common-vio-tracking-camera.html. Some key points: it's best to use Rover-4.1.0-dev and enable EKF3 (explained on the wiki page). You'll need to set the EKF origin, which can be done easily using the Mission Planner (also explained on the wiki page). It is easiest to use an RPI4 with the standard APSync image to remove the chance of issues with the companion computer, but if that's not possible, it's best to provide an onboard log (from the autopilot) so we can see if the companion computer is correctly sending the required messages (we will look at the VISP and VISV messages).
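    For anyone scripting that EKF-origin step instead of using the Mission Planner map, a minimal pymavlink sketch; the connection string and coordinates are placeholders, not values from this thread:

```python
# Hedged sketch (not from this thread): set the EKF origin over MAVLink
# with pymavlink. Connection string and coordinates are placeholders.
from pymavlink import mavutil

master = mavutil.mavlink_connection('udpin:0.0.0.0:14550')
master.wait_heartbeat()

# SET_GPS_GLOBAL_ORIGIN takes lat/lon in degrees * 1e7 and altitude in mm
master.mav.set_gps_global_origin_send(
    master.target_system,
    int(35.0 * 1e7),    # latitude  (placeholder)
    int(139.0 * 1e7),   # longitude (placeholder)
    0)                  # altitude above MSL, mm
```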
    Vinicius Juvinski
    @juvinski
    Hi @rmackay9, yes, I followed the wiki and enabled EKF3. I was testing the D435 on the RvR and fought with problems of depth measurement on objects on the ground. I was testing today and saw variations and jumps in the messages even with the right confidence level, so I opened an issue on Intel's RealSense git. I will do more tests tomorrow and post the logs for you. Since I was checking the stability of the T265, the log is huge, so I will do a small test. I could run missions without a problem, apart from the jump messages, so I will test this and send the log :)
    gauravshukla914
    @gauravshukla914
    Hi. I'm still very new to ROS and Pixhawk. I'm currently working on my drone project. I have been given the task of sending ROS data to a Pixhawk flight controller. I'm able to run the 2D drone navigation code in the simulated environment, where I'm able to achieve mapping, localization and path planning.
    However, now I want to send the odometry data (ground_truth/state) to my Pixhawk flight controller. When I do `rostopic echo /ground_truth/state` I can see the pose, position, covariance and twist data, and I need to send this data to my flight controller. If anyone can suggest an approach, it'd be really helpful. Thank you :)
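    One hedged way to picture that bridging (mavros, suggested below, is the more standard route): a rospy + pymavlink sketch that republishes the Odometry topic as VISION_POSITION_ESTIMATE. The topic name, connection string and ENU-to-NED handling here are assumptions, not tested code:

```python
# Hypothetical sketch: forward a ROS Odometry topic to ArduPilot as
# VISION_POSITION_ESTIMATE via pymavlink. Connection string, topic name
# and frame handling are assumptions.
import rospy
from nav_msgs.msg import Odometry
from pymavlink import mavutil
from tf.transformations import euler_from_quaternion

master = mavutil.mavlink_connection('/dev/ttyAMA0', baud=921600)
master.wait_heartbeat()

def odom_cb(msg):
    p = msg.pose.pose.position
    q = msg.pose.pose.orientation
    roll, pitch, yaw = euler_from_quaternion([q.x, q.y, q.z, q.w])
    # ArduPilot expects NED; Gazebo ground truth is usually ENU,
    # so swap x/y and negate z (attitude handling is simplified here).
    master.mav.vision_position_estimate_send(
        int(rospy.get_time() * 1e6),  # timestamp in microseconds
        p.y, p.x, -p.z,               # north, east, down
        roll, pitch, yaw)

rospy.init_node('odom_to_mavlink')
rospy.Subscriber('/ground_truth/state', Odometry, odom_cb)
rospy.spin()
```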
    patrickpoirier51
    @patrickpoirier51
    @gauravshukla914 I suggest you try on this channel https://gitter.im/mavlink/mavros
    Randy Mackay
    @rmackay9
    @gauravshukla914, I did some random googling and found that "ground_truth" is not a ROS topic, it's a Gazebo topic, so you may need to use gazebo_ros. I don't know much, really - I've never heard of this topic before.
    Randy Mackay
    @rmackay9
    Congratulations @thien94 (and @patrickpoirier51) on your progress with the Intel Depth Camera for object avoidance!! https://discuss.ardupilot.org/t/gsoc-2020-integration-of-ardupilot-and-realsense-d4xx-depth-camera-for-simple-obstacle-avoidance/58474
    Thien Nguyen
    @thien94
    Thank you @rmackay9 for your compliments! Also many thanks to @patrickpoirier51 for your guidance so far!
    patrickpoirier51
    @patrickpoirier51
    😊
    Randy Mackay
    @rmackay9
    @patrickpoirier51, I'm working on creating another APSync image with the updated librealsense (2.35.2 instead of 2.33.1). While I'm updating it I would also like to make it easier for users to switch between 2.4GHz and 5GHz. One of the other testers said everything was fine for him on 2.4GHz using the older APSync image. It looks like it's just the wifi.channel and wifi.operation-mode settings that I need to update. For 5GHz they're "48" and "a" respectively. Do you know what these should be for 2.4GHz?
    This was the commit I added to switch the APSync wifi access point to be 5GHz only: ArduPilot/companion@11c45d4
    patrickpoirier51
    @patrickpoirier51
    @rmackay9 yes, these are the parameters.

    Possible values:

    a: IEEE 802.11a (5 GHz)
    b: IEEE 802.11b (2.4 GHz)
    g: IEEE 802.11g (2.4 GHz)
    ad: IEEE 802.11ad (60 GHz)
    Default value: g

    As for frequencies, looking at the wiki there are numerous choices. WIFI-AP defaults to 6. I chose 48 based on a local scan in my area.
    [image: screenshot of a local wifi channel scan]
    But if you look at the wiki table, 48 should be indoor only in many countries ;-)
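    For what it's worth, that value list matches hostapd's hw_mode documentation, so assuming a hostapd-style backend behind those APSync settings, the 2.4GHz equivalent of the 5GHz setup would look something like this (hypothetical fragment, not the actual APSync file):

```
# hypothetical hostapd-style fragment for a 2.4GHz AP
hw_mode=g    # 802.11g, 2.4GHz (was "a" for 5GHz)
channel=6    # the 2.4GHz default mentioned above (was 48)
```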
    Randy Mackay
    @rmackay9
    OK, thanks. In the APSync image I'm creating now I'll leave the channel at the default, "6"
    Randy Mackay
    @rmackay9
    .. well, actually that didn't seem to work well so I've put it back to channel 48
    Ed Muthiah
    @ed-muthiah
    Has anyone tried out the Up Board Xtreme i7 with the Intel Myriad VPU? Wondering how it would compare to an i7 NUC. Both seem like good options for the Realsense depth+tracking camera combo
    Randy Mackay
    @rmackay9

    I’ve released a new APSync image for the RPI4 + T265 which can be downloaded from https://firmware.ardupilot.org/Companion/apsync/beta/apsync-rpi-ubuntu-t265-20200703.img.xz

    Improvements over the previous beta:

    1. Intel Realsense libraries upgraded to v2.35.2 (was v2.33.1)
    2. Thien’s scripts updated to the latest version including these improvements:
        a. status messages including “T265: connecting to camera” inform the user of connection problems
        b. position reset information passed from the camera to AP’s EKF
        c. vision-speed-estimate and vision-position-delta message support
        d. user sets EKF origin from the GCS map instead of using a hard-coded location
    3. Wifi AP scripts added to allow easier enabling, disabling and switching of the band (2.4GHz vs 5GHz). Log in to the RPI (login: apsync, pwd: apsync) and look in the start_wifi directory. Run commands like “sudo ./start_wifi_2.4Ghz.sh”

    All testing is greatly appreciated!

    Randy Mackay
    @rmackay9
    I've gone ahead and (with Peter Barker's help) made this new APSync image the standard, i.e. the one that is linked from the T265 setup wiki page https://ardupilot.org/copter/docs/common-vio-tracking-camera.html
    patrickpoirier51
    @patrickpoirier51
    @rmackay9 thanks for all this
    These days you are doing T265 and I am into avoidance.... have we switched places? ;-)
    Brock DeWerff
    @dewerff23_twitter
    Hello all! I am an aerospace engineering intern at Northrop Grumman this summer, and for my intern project I was tasked with creating a quadcopter that could complete a couple of different functions in a GPS-denied environment. To do this my team and I decided on a setup involving a Raspberry Pi 4, a Pixhawk Cube, and 3 TFmini-S time-of-flight sensors. On the software side I personally adapted the t265_to_mavlink script to work in tandem with the time-of-flight sensors to send the VISION_POSITION_ESTIMATE message. Using Mission Planner I have confirmed that the Cube is receiving these messages and that the global home is being set. However, I have noticed that after successive runs of my script the quadcopter’s local position will sometimes be more than 200 meters away from the 0,0,0 point that was set. When this happens the primary EKF gets set to the backup of 0 or 1 rather than the 2 which I set in the parameters. I’m assuming this is also what is preventing us from switching into Auto or Guided mode after the quad arms in Stabilize. If anyone could help point me toward a solution to this problem it would be greatly appreciated! Thank you for your time!
    lightpathproject
    @lightpathproject
    @rmackay9 unable to arm with this new image. “PreArm Logging Failed”
    I am able to see vision position estimate messages through mavlink inspector.
    Randy Mackay
    @rmackay9
    @lightpathproject, there may be a problem with the SD card on the autopilot. It may help to format the card or better yet just replace it.
    @dewerff23_twitter, welcome. It's probably best to provide an onboard log, but I guess there's a problem with the vision-position-estimate message being sent in. Perhaps its update rate is too low, or perhaps it sometimes gives unreasonable values and the EKF is deciding to ignore it. I've also seen the EKF lane switch when bad position information is provided; we haven't gotten to the bottom of why this happens, but in any case, all lanes will be receiving the vision-position-estimate message so there shouldn't be much difference between the position estimates they come up with.
    Brock DeWerff
    @dewerff23_twitter
    I’ll have to ask around to see if it is acceptable to release the EK2 logs. When I use the MAVLink inspector it shows the VPE message being sent at around 27-31 Hz. I will say that we were advised to make the takeoff and landing milestone a one-dimensional problem, so only the down-facing sensor is inputting data; the other two are just returning 0's. Could that be considered unreasonable? Also, this issue seems to be resolved when I reboot the Pixhawk Cube, but it still says variance is preventing the Cube from entering Auto mode.
    Randy Mackay
    @rmackay9
    @dewerff23_twitter, the EKF won't have a good horizontal position estimate if only altitude is provided in the vision-position-estimate messages, and all the autonomous modes require a good position estimate to work. Auto mode isn't happy only controlling one axis - there are various checks to ensure it has a good 3D position estimate.
    By the way, the EKF can be directly set up to use a range finder for its altitude estimate during takeoff and landing. The way to do this is to set the EKx_RNG_USE_HGT and EKx_RNG_USE_SPD parameters. In general I don't recommend this, and it probably isn't helpful if your longer-term goal is to provide a 3D position estimate based on lidar.
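    A minimal pymavlink sketch of setting such a parameter from a script; the value here is only an illustration (EK3_RNG_USE_HGT is a percentage of the rangefinder's maximum range, and -1 disables it):

```python
# Hedged sketch: set EK3_RNG_USE_HGT over MAVLink. 70 is an illustrative
# value (percent of rangefinder max range); -1 disables the feature.
from pymavlink import mavutil

master = mavutil.mavlink_connection('udpin:0.0.0.0:14550')
master.wait_heartbeat()
master.mav.param_set_send(
    master.target_system, master.target_component,
    b'EK3_RNG_USE_HGT', 70.0,
    mavutil.mavlink.MAV_PARAM_TYPE_REAL32)
```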
    Bo Shang
    @cnpcshangbo
    Hi @thien94, I am trying to compile your project vision_to_mavros; however, project 'cv_bridge' cannot find OpenCV. Have you encountered issues like this? Do you know which environment variable I should set? ros-perception/vision_opencv#345
    gauravshukla914
    @gauravshukla914
    Hey guys. I want to fly the drone indoors through DroneKit: take off, reach a given altitude, then fly forward, backward, left and right. I can achieve all this through DroneKit-Python with GPS enabled. However, to fly indoors, I can't rely on GPS. How should I approach this? I read that RC override is not recommended at all. Is there any other alternative?
    Stephen Dade
    @stephendade
    @gauravshukla914: Use GUIDED mode and push position setpoints to the vehicle. You'll need an indoor navigation system too.
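    A hedged sketch of what "pushing position setpoints" can look like with pymavlink (SET_POSITION_TARGET_LOCAL_NED); the connection string and offsets are placeholders, and the vehicle must already be armed in GUIDED:

```python
# Hypothetical sketch: send one GUIDED-mode position setpoint via
# SET_POSITION_TARGET_LOCAL_NED. Placeholders throughout.
from pymavlink import mavutil

master = mavutil.mavlink_connection('udpin:0.0.0.0:14550')
master.wait_heartbeat()

master.mav.set_position_target_local_ned_send(
    0,                                    # time_boot_ms (unused here)
    master.target_system, master.target_component,
    mavutil.mavlink.MAV_FRAME_LOCAL_NED,
    0b110111111000,                       # type_mask: use position only
    2.0, 0.0, -1.5,                       # 2 m north, 1.5 m up (z is down)
    0, 0, 0,                              # velocity (ignored)
    0, 0, 0,                              # acceleration (ignored)
    0, 0)                                 # yaw, yaw_rate (ignored)
```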
    Thien Nguyen
    @thien94
    @cnpcshangbo Have you tried to install the package via apt-get?
    sudo apt-get install ros-(ROS version name)-cv-bridge
    sudo apt-get install ros-(ROS version name)-vision-opencv
    @gauravshukla914 just to add to @stephendade 's answer, here's a list of non-GPS positioning systems for your reference https://ardupilot.org/copter/docs/common-non-gps-navigation-landing-page.html
    Bo Shang
    @cnpcshangbo
    Hi @thien94, I have installed those two packages with apt-get. Do they include the installation of OpenCV, or do I need some environment variable to let catkin_make know where OpenCV is? I am using a Jetson, and it is said that the Jetson has some bugs in its OpenCV configuration.
    Randy Mackay
    @rmackay9
    @gauravshukla914, our list of possible methods for non-GPS navigation is here: https://ardupilot.org/copter/docs/common-non-gps-navigation-landing-page.html. I think the Intel RealSense T265 is the best performing option for most use cases.
    Thien Nguyen
    @thien94
    @cnpcshangbo For Jetson you will probably have to build the CUDA version of OpenCV. Maybe these solutions could help: link 1, link 2.
    However, if you don't plan to use the OpenCV-related node, you can just comment out the cv_bridge dependency as well as the related cpp file in the CMake file.
    apinxiko
    @apinxiko
    Hi @thien94 @patrickpoirier51 @rmackay9 I’ve been testing a down-facing mounted T265 with the librealsense-build-2.35.2 library to improve on the existing issues with the downward-facing configuration, but "AP_VisualOdom_IntelT265::pre_arm_check" still tends to be triggered, as a result of either “if (!healthy())” or “if (yaw_diff_deg > 10.0f)”, while the vehicle is running its arming checks. Any ideas to minimize those risks?
    Thien Nguyen
    @thien94
    You should check the T265's data, specifically the tracking confidence and the position + attitude while the vehicle is static.
    If the T265 is not working properly, you may need to re-adjust the T265's position on the frame and test in different environments. The T265 will probably not work well if it's too close to the ground or if there are too few visual features in the scene.
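    A small pyrealsense2 sketch for that bench check (assuming the T265 is plugged into the machine running the script):

```python
# Hedged sketch: print the T265's tracker confidence and position while
# the vehicle sits still, using pyrealsense2's pose stream.
import pyrealsense2 as rs

pipe = rs.pipeline()
cfg = rs.config()
cfg.enable_stream(rs.stream.pose)
pipe.start(cfg)
try:
    while True:
        frames = pipe.wait_for_frames()
        pose = frames.get_pose_frame()
        if pose:
            data = pose.get_pose_data()
            # tracker_confidence: 0=failed, 1=low, 2=medium, 3=high
            print("confidence:", data.tracker_confidence,
                  "xyz:", data.translation.x, data.translation.y, data.translation.z)
finally:
    pipe.stop()
```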
    Randy Mackay
    @rmackay9
    @apinxiko, if you've got the compass enabled then the pre-arm failure is likely because the camera and the compass aren't pointing in the same direction, which will cause trouble during Loiter, etc. You can re-align the camera's yaw to the compass's yaw with an auxiliary switch by setting RCx_OPTION = 80 (Viso Align). Then move the switch to the high position before arming and a message will be displayed on the HUD with some info on how much of an adjustment it made. The amount it adjusts the yaw is not really important, but at least you'll know that they're aligned.
    rge99
    @rebenstein
    I have been away for a while - has there been any progress in integrating the T265 for object avoidance in ArduCopter?
    Bo Shang
    @cnpcshangbo
    Hi @thien94, I’ll try commenting out the cv_bridge dependency and the related cpp file in the CMake file. Thanks for your suggestions.