    Randy Mackay
    @rmackay9
    I've gone ahead and (with Peter Barker's help) made this new apsync image the standard, i.e. the one linked from the T265 setup wiki page https://ardupilot.org/copter/docs/common-vio-tracking-camera.html
    patrickpoirier51
    @patrickpoirier51
    @rmackay9 thanks for all this
    These days you are doing the T265 and I am into avoidance... have we switched places? ;-)
    Brock DeWerff
    @dewerff23_twitter
    Hello all! I am an aerospace engineering intern at Northrop Grumman this summer, and for my intern project I was tasked with creating a quadcopter that could complete a couple of different functions in a GPS-denied environment. To do this my team and I decided on a setup that involves a Raspberry Pi 4, Pixhawk Cube, and three TFmini-S time-of-flight sensors. On the software side I personally adapted the t265_to_mavlink script to work in tandem with the time-of-flight sensors to send the VISION_POSITION_ESTIMATE message. Using Mission Planner I have confirmed that the Cube is receiving these messages and that the global home is being set. However, I have noticed that after successive runs of my script the quadcopter's local position will sometimes be more than 200 meters away from the 0,0,0 point that was set. When this happens the primary EKF gets set to the backup of 0 or 1 rather than the 2 which I set in the parameters. I'm assuming this is also what is preventing us from switching into Auto or Guided mode after the quad arms in Stabilize. If anyone could help point me in the direction of a solution to this problem it would be greatly appreciated! Thank you for your time!
    lightpathproject
    @lightpathproject
    @rmackay9 unable to arm with this new image. “PreArm Logging Failed”
    I am able to see vision position estimate messages through mavlink inspector.
    Randy Mackay
    @rmackay9
    @lightpathproject, there may be a problem with the SD card on the autopilot. It may help to format the card or better yet just replace it.
    @dewerff23_twitter, welcome. It's probably best to provide an onboard log, but I guess there's a problem with the vision-position-estimate message being sent in. Perhaps its update rate is too low, or perhaps it's sometimes giving unreasonable values and the EKF is deciding to ignore it. I've also seen the EKF lane switch when bad position information is provided; we haven't gotten to the bottom of why this happens, but in any case all lanes will be receiving the vision-position-estimate message so there shouldn't be much difference between the position estimates they come up with.
    Brock DeWerff
    @dewerff23_twitter
    I'll have to ask around to see if it is acceptable to release the EK2 logs. When I use the MAVLink Inspector it shows the VPE message being sent at around 27-31 Hz. I will say that it was advised to make the takeoff and landing milestone a one-dimensional problem, so only the down-facing sensor is inputting data. The other two are just returning 0's. Could that be considered unreasonable? Also, this issue seems to be resolved when I reboot the Pixhawk Cube, but it still says variance is preventing the cube from entering Auto mode.
    Randy Mackay
    @rmackay9
    @dewerff23_twitter, the EKF won't have a good horizontal position estimate if only altitude is provided in the vision-position-estimate messages, and all the autonomous modes require a good position estimate to work. Auto mode isn't happy only controlling one axis - there are various checks to ensure it has a good 3D position estimate.
    By the way, the EKF can be directly set up to use a range finder for its altitude estimate during takeoff and landing. The way to do this is to set the EKx_RNG_USE_HGT and EKx_RNG_USE_SPD parameters. In general I don't recommend this, and it probably isn't helpful if your longer-term goal is to provide a 3D position estimate based on lidar.
    Bo Shang
    @cnpcshangbo
    Hi @thien94, I am trying to compile your project vision_to_mavros, however the 'cv_bridge' project cannot find OpenCV. Have you encountered issues like this? Do you know which environment variable I should set? ros-perception/vision_opencv#345
    gauravshukla914
    @gauravshukla914
    Hey guys. I want to fly the drone indoors through DroneKit: take off, reach a specified altitude, then fly forward, backward, left and right. I can achieve all this through DroneKit-Python with GPS enabled. However, to fly indoors, I can't rely on GPS. How should I approach this? I read that RC override is not recommended at all. Is there any other alternative to it?
    Stephen Dade
    @stephendade
    @gauravshukla914: Use GUIDED mode and push position setpoints to the vehicle. You'll need an indoor navigation system too.
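    As a rough sketch of what "pushing position setpoints" can look like from DroneKit-Python (the connection string and the 2 m north / 1 m up offsets are placeholders, not values from the chat):

    from dronekit import connect, VehicleMode
    from pymavlink import mavutil
    import time

    vehicle = connect('/dev/ttyAMA0', wait_ready=True, baud=921600)  # adjust for your link
    vehicle.mode = VehicleMode("GUIDED")
    time.sleep(1)

    # Position-only setpoint in the local NED frame (x north, y east, z down, metres).
    # The type_mask tells the autopilot to ignore the velocity/accel/yaw fields.
    # The vehicle must be armed and have a position estimate for this to take effect.
    msg = vehicle.message_factory.set_position_target_local_ned_encode(
        0, 0, 0,                               # time_boot_ms, target system, target component
        mavutil.mavlink.MAV_FRAME_LOCAL_NED,
        0b0000111111111000,                    # use only x, y, z
        2.0, 0.0, -1.0,                        # 2 m north, 0 m east, 1 m up
        0, 0, 0, 0, 0, 0, 0, 0)                # velocity, accel, yaw fields (ignored)
    vehicle.send_mavlink(msg)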
    Thien Nguyen
    @thien94
    @cnpcshangbo Have you tried to install the package via apt-get?
    sudo apt-get install ros-(ROS version name)-cv-bridge
    sudo apt-get install ros-(ROS version name)-vision-opencv
    @gauravshukla914 just to add to @stephendade's answer, here's a list of non-GPS positioning systems for your reference https://ardupilot.org/copter/docs/common-non-gps-navigation-landing-page.html
    Bo Shang
    @cnpcshangbo
    Hi @thien94, I have installed those two packages with apt-get. Do they include an installation of OpenCV, or do I need some environment variable to let catkin_make know where OpenCV is? I am using a Jetson. I've read that the Jetson has some bugs in its OpenCV configuration.
    Randy Mackay
    @rmackay9
    @gauravshukla914, our list of possible methods for Non-GPS navigation are here: https://ardupilot.org/copter/docs/common-non-gps-navigation-landing-page.html. I think the Intel RealSense T265 is the best performing option for most use cases.
    Thien Nguyen
    @thien94
    @cnpcshangbo For Jetson you will probably have to build the CUDA version of OpenCV. Maybe these solutions could help: link 1, link 2.
    However, if you don't plan to use the opencv-related node, you can just comment out the cv_bridge dependency as well as the related cpp file in the CMake file.
    apinxiko
    @apinxiko
    Hi @thien94 @patrickpoirier51 @rmackay9 I've been testing a down-facing mounted T265 with the librealsense-build-2.35.2 library to improve the existing issues with the downward-facing configuration, but "AP_VisualOdom_IntelT265::pre_arm_check" still tends to be triggered, either by "if (!healthy())" or by "if (yaw_diff_deg > 10.0f)", while the vehicle runs its arming checks. Any ideas to minimize those risks?
    Thien Nguyen
    @thien94
    You should check the T265's data, specifically the tracking confidence and position + attitude when the vehicle is static.
    If the T265 is not working properly, you may need to re-adjust the T265's position on the frame, and test with different environments. The T265 will probably not work well if it's too close to the ground, or there are too few visual features in the scene.
    Randy Mackay
    @rmackay9
    @apinxiko, if you've got the compass enabled then the pre-arm failure is likely because the camera and the compass aren't pointing in the same direction, which will cause trouble during Loiter, etc. You can re-align the camera's yaw to the compass's yaw with an auxiliary switch by setting RCx_OPTION = 80 (Viso Align). Then move the switch to the high position before arming and a message will be displayed on the HUD with some info on how much of an adjustment it made. The amount it adjusts the yaw is not really important, but at least you'll know that they're aligned.
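    For reference, that aux function mapping is just a parameter, so it can also be set over MAVLink; a minimal pymavlink sketch, assuming channel 7 and a placeholder UDP endpoint (neither is specified in the chat):

    from pymavlink import mavutil

    m = mavutil.mavlink_connection('udpin:0.0.0.0:14550')  # placeholder endpoint
    m.wait_heartbeat()
    # 80 = Viso Align; RC7 is only an example aux channel.
    m.mav.param_set_send(m.target_system, m.target_component,
                         b'RC7_OPTION', 80,
                         mavutil.mavlink.MAV_PARAM_TYPE_REAL32)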
    rge99
    @rebenstein
    I have been away for a while - has there been any progress in integrating the T265 for object avoidance in ArduCopter?
    Bo Shang
    @cnpcshangbo
    Hi @thien94, I’ll try to comment out the cv_bridge dependency and the related cpp file in CMake file. Thanks for your suggestions.
    gauravshukla914
    @gauravshukla914
    Thank you @rmackay9 @thien94 @stephendade. If I use any of the listed methods for non-GPS navigation, will I have to make changes to my DroneKit-Python script? Currently I arm in Guided mode, take off to a specified altitude, then use velocity commands to travel north, south, east and west through the send_ned_velocity function. Or does the code remain the same?
    apinxiko
    @apinxiko
    Thanks a lot @thien94 @rmackay9. The mounting height above the ground is approx. 13 cm. Even though I realign the camera's yaw to the compass's before arming, the EKF status indicator (green) tends to climb to nearly 0.5 and at the same time the "not healthy" pre-arm check is triggered. I am also skeptical about its UAVCAN-based compass, which has some issues reported in #12165.
    Randy Mackay
    @rmackay9
    @rebenstein, yes the T265 integration is mostly complete, but the camera only provides position (not distances) so it's only useful for non-GPS navigation. The wiki is here: https://ardupilot.org/copter/docs/common-vio-tracking-camera.html . Some key points: it's best to use an RPi4 and install the linked "APSync" image which should "just work". Instead of using ArduPilot 4.0.x, please try 4.1.x-dev (aka "latest") and enable EKF3 (details on the wiki). This "latest" version hasn't gone through beta testing so it is more dangerous than using the stable version, but it should work better with the T265.
    @rebenstein, Thien has published a blog here for using another Intel RealSense camera for object avoidance: https://discuss.ardupilot.org/t/gsoc-2020-integration-of-ardupilot-and-realsense-d4xx-depth-camera-for-simple-obstacle-avoidance/58474. This is still a work-in-progress but it's looking good so far!
    @gauravshukla914, I don't think you will need to make changes to the dronekit-python script you've written. The various non-GPS navigation methods help the EKF get a position estimate without a GPS, but the control part of ArduPilot all remains the same.
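    For context, a send_ned_velocity helper along the lines of the DroneKit-Python docs looks roughly like the sketch below (a guess at what the script in question does, not code from this chat). Nothing in it refers to GPS, which is why it keeps working once the EKF gets its position from a non-GPS source:

    import time
    from dronekit import connect
    from pymavlink import mavutil

    vehicle = connect('/dev/ttyAMA0', wait_ready=True, baud=921600)  # placeholder connection

    def send_ned_velocity(vx, vy, vz, duration):
        """Move at the given velocity (m/s, local NED frame) for `duration` seconds."""
        msg = vehicle.message_factory.set_position_target_local_ned_encode(
            0, 0, 0,                               # time_boot_ms, target system, target component
            mavutil.mavlink.MAV_FRAME_LOCAL_NED,
            0b0000111111000111,                    # use only the velocity fields
            0, 0, 0,                               # x, y, z positions (ignored)
            vx, vy, vz,                            # velocities in m/s
            0, 0, 0, 0, 0)                         # accelerations, yaw, yaw_rate (ignored)
        for _ in range(int(duration)):
            vehicle.send_mavlink(msg)              # re-send at 1 Hz so the setpoint isn't timed out
            time.sleep(1)

    send_ned_velocity(1.0, 0, 0, 5)                # e.g. 1 m/s north for 5 seconds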
    Randy Mackay
    @rmackay9
    @apinxiko, on the EKF Status screen, it's the "Compass" bar that keeps climbing above 0.5? That does make it sound like the heading is drifting around until it doesn't agree with the compass anymore. When the vehicle is on the ground and disarmed this should all be related to the gyros and the compass. Maybe try triggering a gyro calibration? I think this can be done by going to the MP's Data screen's Actions tab and selecting "PREFLIGHT_CALIBRATION" and pressing the "Do Action" button. Beyond this I think I'll need to see a log.
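    If scripting is easier than clicking through Mission Planner, a gyro calibration can also be requested over MAVLink; a minimal pymavlink sketch, assuming a placeholder UDP endpoint and a disarmed vehicle:

    from pymavlink import mavutil

    m = mavutil.mavlink_connection('udpin:0.0.0.0:14550')  # placeholder endpoint
    m.wait_heartbeat()
    # MAV_CMD_PREFLIGHT_CALIBRATION with param1=1 requests a gyro calibration.
    m.mav.command_long_send(m.target_system, m.target_component,
                            mavutil.mavlink.MAV_CMD_PREFLIGHT_CALIBRATION,
                            0,                     # confirmation
                            1, 0, 0, 0, 0, 0, 0)   # param1=1 -> gyro calibration only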
    Randy Mackay
    @rmackay9
    @cnpcshangbo, the "No SRTM data for this area" message is a Mission Planner issue I think. It's unrelated to the core ArduPilot flight code. It may help to zoom in more before selecting "set EKF origin". Also make sure that you can see the altitude on the PLAN screen as you move the cursor around. I'd also recommend setting the EKF origin to where the vehicle actually is. So don't, for example, try to set the EKF origin in the sea.
    Ed Muthiah
    @ed-muthiah
    @patrickpoirier51 Hey Patrick, are you having this grainy image issue with Pi v2 Camera on your NX?
    patrickpoirier51
    @patrickpoirier51
    @ed-muthiah actually I haven't tried it yet; I'm focused on getting the D435 working with the GSoC project
    Ian McElhenny
    @mcelhennyi
    Hey guys, any known issues with SITL and Guided mode? I am trying to arm in Guided mode, with a flow sensor and rangefinder enabled. I have disabled GPS in the EKF2 and as an input in general (set to none), but since I have no GPS, the GPS message I am seeing has 65535 as the HDOP and GDOP. So it's too high, which is causing an "APM: PreArm: High GPS HDOP" error and it won't let me arm... is the SITL not good enough for flow + rangefinder only Guided mode?
    The goal is of course to use a SLAM-based vision approach, but I'm trying to get the SITL set up for the flow base case first.
    I also tried the same configs with EKF3 and still get the HDOP error
    Joel Fankam
    @joekrom
    @patrickpoirier51 hi, I have a question: is feeding the drone with the GPS_INPUT message (that means sending lat, lon and yaw angle) enough to perform autonomous indoor flight?
    Bo Shang
    @cnpcshangbo
    [image attachment: image.png]
    Hi @rmackay9 and @thien94, thanks for your suggestions on the Mission Planner "set EKF home" issue. I have tried using a Python script to set the origin. However, EKF2 still refuses to set the origin. Is there something wrong with the data I sent?
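    For reference, a pymavlink sender for SET_GPS_GLOBAL_ORIGIN looks roughly like the sketch below; a common gotcha is passing raw floats instead of the scaled integers the message expects (the endpoint and coordinates here are placeholders, not values from the chat):

    from pymavlink import mavutil

    m = mavutil.mavlink_connection('udpin:0.0.0.0:14550')  # placeholder endpoint
    m.wait_heartbeat()
    # SET_GPS_GLOBAL_ORIGIN wants lat/lon in degrees * 1e7 and altitude in millimetres.
    m.mav.set_gps_global_origin_send(m.target_system,
                                     int(35.000000 * 1e7),   # latitude  (degE7, placeholder)
                                     int(139.000000 * 1e7),  # longitude (degE7, placeholder)
                                     int(10 * 1000))         # altitude  (mm)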
    Randy Mackay
    @rmackay9
    @mcelhennyi, I think you'll just need to set the arming checks to remove the GPS check. I forget what the actual bit is for the GPS, but because it's SITL you could just set ARMING_CHECK = 0 and it should work. You'll also need to set EKx_GPS_TYPE = 3. This should be more clear on the wiki so I've added a to-do to add it. ArduPilot/ardupilot_wiki#2909
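    A minimal sketch of setting those two parameters against SITL with pymavlink (the UDP endpoint is a typical SITL default, and EK2_GPS_TYPE is used here only because the question mentions EKF2):

    from pymavlink import mavutil

    m = mavutil.mavlink_connection('udp:127.0.0.1:14550')  # typical SITL output
    m.wait_heartbeat()
    for name, value in [(b'ARMING_CHECK', 0),     # disable arming checks (SITL only!)
                        (b'EK2_GPS_TYPE', 3)]:    # EKF2: don't use the GPS at all
        m.mav.param_set_send(m.target_system, m.target_component,
                             name, value, mavutil.mavlink.MAV_PARAM_TYPE_REAL32)

    The same thing can of course be done with "param set" in MAVProxy.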
    Randy Mackay
    @rmackay9
    @joekrom, I think you'll need to set the GPS_TYPE parameter to 14 as well. If you're using a Vicon system, remember that it's important that the compass direction agrees with the north-south direction of the Vicon.
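    For reference, sending position (and optionally yaw) through GPS_INPUT with pymavlink looks roughly like the sketch below. All of the numbers are placeholders; yaw is in centidegrees (0 means the field is unused, 36000 means north) and is only consumed when the EKF is configured for it, as Randy describes a few messages further down:

    from pymavlink import mavutil

    m = mavutil.mavlink_connection('udpin:0.0.0.0:14550')  # placeholder endpoint
    m.wait_heartbeat()

    # We don't supply velocities or speed accuracy, so tell ArduPilot to ignore them.
    IGNORE = (mavutil.mavlink.GPS_INPUT_IGNORE_FLAG_VEL_HORIZ |
              mavutil.mavlink.GPS_INPUT_IGNORE_FLAG_VEL_VERT |
              mavutil.mavlink.GPS_INPUT_IGNORE_FLAG_SPEED_ACCURACY)

    m.mav.gps_input_send(
        0, 0, IGNORE,         # time_usec, gps_id, ignore_flags
        0, 0,                 # GPS time (week ms, week number)
        3,                    # fix_type: 3D fix
        int(35.0 * 1e7),      # lat (degE7, placeholder)
        int(139.0 * 1e7),     # lon (degE7, placeholder)
        10.0,                 # alt (m)
        1.0, 1.0,             # hdop, vdop
        0, 0, 0, 0,           # vn, ve, vd, speed_accuracy (ignored)
        0.5, 0.5,             # horiz_accuracy, vert_accuracy (m)
        10,                   # satellites_visible
        9000)                 # yaw in cdeg (90 degrees, placeholder)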
    Joel Fankam
    @joekrom
    @rmackay9 is it possible to achieve indoor autonomous flight using GPS_INPUT and by sending the yaw angle?
    @rmackay9 I am using Pozyx
    Randy Mackay
    @rmackay9
    Yes, it probably is possible. I think you'll need to set the EKx_MAG_CAL parameter (weird name, I know) to make it consume this yaw, and the compasses will need to be disabled (COMPASS_USE = 0, COMPASS_USE2 = 0, COMPASS_USE3 = 0).
    If you're using Pozyx we have another method though: https://ardupilot.org/copter/docs/common-pozyx.html. Be sure to use EKF3 though.
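    A rough DroneKit-Python sketch of the parameter setup Randy describes just above; the EK3_MAG_CAL value of 5 ("use external yaw sensor" on Copter 4.x of this era) is my reading of the parameter docs rather than something stated in the chat, so verify it for your firmware:

    from dronekit import connect

    vehicle = connect('/dev/ttyAMA0', wait_ready=True, baud=921600)  # placeholder connection

    vehicle.parameters['EK3_MAG_CAL'] = 5      # assumed "external yaw" option - verify for your firmware
    vehicle.parameters['COMPASS_USE'] = 0      # disable all three compasses
    vehicle.parameters['COMPASS_USE2'] = 0
    vehicle.parameters['COMPASS_USE3'] = 0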
    Joel Fankam
    @joekrom
    Ok thanks, I will try it and let you know. Thanks a lot.
    To use this method I need to adapt your script in Python since I am using ArduPilot. Is that a possibility?
    Randy Mackay
    @rmackay9
    Using the above method, you can align the beacon system's coordinates with the compass (i.e. leave the compass enabled) using the BCN_ORIENT_YAW parameter. Pozyx can't provide the vehicle's yaw, it can only provide the position, so the compasses need to be enabled.
    Joel Fankam
    @joekrom
    I am using a Raspberry Pi instead of an Arduino