    Randy Mackay
    @rmackay9
    yes, seems like a good idea.. both can be used from within python scripts so I was thinking some example scripts that show how to do that..
    or better yet some working code that does that, like precision landing on a target
    Silvio Revelli
    @silviorevelli
    perfect! I'll start thinking about a simple way to implement this. Thomas, are you with us for PL?
    a7scenario
    @a7scenario
    @silviorevelli perhaps a RaspberryPi2 folder as well in the companion computer repository? It seems like it’s likely the most commonly used companion computer.
    ThomasSFL
    @ThomasSFL
    @a7scenario @silviorevelli @rmackay9 We have recently added Linux support for the IR-LOCK sensor, connected via USB. (http://bit.ly/1l7TntY) So one of my current projects is to try python scripts for PL controls development (i.e., in Guided mode).... I think this is relevant to what you are proposing, but I am not 100% sure what you mean by "OpenCV-to-DroneAPI" bridge. Is this similar to what Daniel Nugent has already demonstrated? (http://bit.ly/1PCvLL2)
    a7scenario
    @a7scenario
    @ThomasSFL For your python module, I wonder if it could just populate and output LANDING_TARGET messages that the 3.4 firmware is using for PL, as opposed to using Guided mode?
    Very cool re the linux support… is it limited to the sensor provided in the Mark One Kit, or older IR-Lock sensors as well?
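The LANDING_TARGET idea above could be sketched on a companion computer roughly as follows. This is a minimal sketch, not code from the thread: the camera field-of-view, resolution, serial port, and rates are all illustrative assumptions you would replace with your sensor's real values.

```python
import math

# Illustrative camera parameters -- assumptions, adjust for the actual sensor.
HFOV_RAD = math.radians(60.0)   # horizontal field of view
VFOV_RAD = math.radians(45.0)   # vertical field of view
IMG_W, IMG_H = 320, 200         # image resolution in pixels

def pixel_to_angles(px, py):
    """Convert a target's pixel position to the angular offsets
    (radians) that a MAVLink LANDING_TARGET message carries."""
    angle_x = (px - IMG_W / 2.0) / IMG_W * HFOV_RAD
    angle_y = (py - IMG_H / 2.0) / IMG_H * VFOV_RAD
    return angle_x, angle_y

# With pymavlink, the message would then be sent roughly like this
# (connection string is an assumption):
#
#   import time
#   from pymavlink import mavutil
#   master = mavutil.mavlink_connection('/dev/ttyAMA0', baud=921600)
#   ax, ay = pixel_to_angles(px, py)
#   master.mav.landing_target_send(
#       int(time.time() * 1e6),              # time_usec
#       0,                                   # target_num
#       mavutil.mavlink.MAV_FRAME_BODY_NED,  # frame
#       ax, ay,                              # angle_x, angle_y (rad)
#       distance, 0.0, 0.0)                  # distance (m), size_x, size_y
```

A target in the image center maps to zero angular offset, which is what the firmware's PL controller drives toward.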
    ThomasSFL
    @ThomasSFL
    @a7scenario Yes, it should work the same for all of the IR markers.... Also, the sensor is actually natively supported (i.e., simply plug it directly into Pixhawk), but I was planning to do some controls development/experimentation. For me, it is easier to prototype the controls in Python on the companion computer, as opposed to C++.
    a7scenario
    @a7scenario
    @ThomasSFL Using the IR-Lock sensor/beacon during Precision Land, is there any way for your Companion Computer or GCS to know that the quad, in fact, successfully landed on the target? I’m considering using that libpixyusb library for my companion computer to “check” whether the pixy can still see the beacon after land. Based on the narrow field of view of the Pixy, I think this would indicate that the quad landed on the beacon. Is there a better way of doing this?
    ThomasSFL
    @ThomasSFL
    @a7scenario That is a very good idea if you are developing a fully-automated system. There are probably multiple ways to accomplish this. (1) You could directly modify the control_land.cpp file in the APM code (w/ the sensor connected via I2C). Perhaps, the copter should ascend when the target is not detected for a period of time. (2) You can control the copter in GUIDED mode via the companion computer (using libpixyusb to connect the sensor).... If a new target position has not been detected for a period of time, the copter should probably be commanded to ascend. This would be useful if very high landing accuracy is absolutely required.
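Option (2) above, ascending when no new target has been detected for a while, might look something like this on the companion computer. The class name and timeout value are hypothetical, and the actual GUIDED-mode commands are left out; this only shows the decision logic.

```python
# Hypothetical threshold: how long to tolerate no target detection
# before commanding the copter to climb and reacquire.
TARGET_TIMEOUT_S = 2.0

class PrecLandMonitor:
    """Sketch of a companion-computer loop that decides, in GUIDED
    mode, whether to keep descending or to climb back up."""

    def __init__(self, timeout=TARGET_TIMEOUT_S):
        self.timeout = timeout
        self.last_seen = None   # timestamp of last target detection

    def target_detected(self, t):
        """Call whenever the sensor (e.g. via libpixyusb) sees the target."""
        self.last_seen = t

    def action(self, t):
        """Return 'descend' while the detection is fresh, else 'ascend'."""
        if self.last_seen is None or (t - self.last_seen) > self.timeout:
            return 'ascend'
        return 'descend'
```

In a real loop, 'ascend' would translate into a GUIDED-mode velocity or position setpoint sent to the flight controller.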
    Randy Mackay
    @rmackay9
    @a7scenario, I think a companion computer can check the HEARTBEAT message and once the vehicle has landed the system_status will become MAV_STATE_STANDBY. The only problem is that if the vehicle has landed because of a failsafe (like battery failsafe) then its state is overwritten with MAV_STATE_CRITICAL or MAV_STATE_EMERGENCY.
    knowing whether the vehicle is landed or not is something that the 3DR solo guys have also asked for. I think the solution will be to remove the failsafe state reporting from the HEARTBEAT message and instead report this in a separate new mavlink message. Maybe called something like FAILSAFE_STATE which would allow simultaneous reporting of all failsafes.
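A minimal sketch of the HEARTBEAT check described above, including the failsafe caveat. The MAV_STATE constants below are from the MAVLink common message set; the helper function itself is illustrative.

```python
# MAV_STATE values from the MAVLink common message set.
MAV_STATE_STANDBY = 3    # vehicle grounded / motors disarmed
MAV_STATE_ACTIVE = 4     # motors engaged, flying
MAV_STATE_CRITICAL = 5   # failsafe active
MAV_STATE_EMERGENCY = 6  # severe failsafe

def landed_state(system_status):
    """Interpret HEARTBEAT.system_status: STANDBY normally means
    landed, but a failsafe landing overwrites the state with
    CRITICAL or EMERGENCY, so those cases stay ambiguous."""
    if system_status == MAV_STATE_STANDBY:
        return 'landed'
    if system_status in (MAV_STATE_CRITICAL, MAV_STATE_EMERGENCY):
        return 'failsafe'   # may or may not be on the ground
    return 'flying'
```

With pymavlink, `system_status` comes from `master.recv_match(type='HEARTBEAT', blocking=True).system_status`.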
    a7scenario
    @a7scenario
    @rmackay9 Thanks man (I had been using the rangefinder, but heartbeat is better). That said, I was actually referring to how I can tell if I landed on the precision landing target (not just that the aircraft is landed). I need some way for the companion computer to know whether it did, in fact, land on the target or not. Per above, I was thinking of simply continuing to calculate the LANDING_TARGET msgs post-landing. If the target is still in field-of-view post-landing, you should be able to calculate whether you are sitting on the target or not (and maybe even how far off you were from the target).
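The post-landing check proposed above reduces to simple trigonometry: once the copter is on the ground, the camera sits at a known fixed height, so the LANDING_TARGET angular offsets map directly to a ground-plane distance from the target. A sketch, where the camera height is an assumed value for illustration:

```python
import math

# Assumed lens height above the ground once the copter has landed.
CAMERA_HEIGHT_M = 0.10

def offset_on_ground(angle_x, angle_y, cam_height=CAMERA_HEIGHT_M):
    """Distance (m) between the camera axis and the target on the
    ground plane, from post-landing LANDING_TARGET angular offsets."""
    dx = cam_height * math.tan(angle_x)
    dy = cam_height * math.tan(angle_y)
    return math.hypot(dx, dy)
```

A threshold on this distance (say, half the beacon's radius) would give a yes/no "landed on target" answer.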
    Randy Mackay
    @rmackay9
    @a7scenario, yes that sounds like a good way to do it
    ThomasSFL
    @ThomasSFL
    @a7scenario Based on the projects I have seen, if a small target is still in-view after landing, that can be considered a good landing. :smile: (unless the camera sensor is mounted very 'high' on the copter)
    Chambana
    @Chambana
    @rmackay9 I saw your discussion on google groups about incorporating a velocity-based controller into precision landing, instead of using a position based controller.
    Is that going to be added to master soon (or is it already in there?)
    ThomasSFL
    @ThomasSFL
    @Chambana Master currently includes the position controller. The position controller has undergone a lot of testing (by IR-LOCK users), but the velocity controller is promising, especially the 'tune-ability'.
    Silvio Revelli
    @silviorevelli
    @ThomasSFL finally! Great job :D
    'tune-ability'?
    Lloyd Ramey
    @lnr0626
    @ThomasSFL @a7scenario RE the velocity controller, it looks like it has been moved into master sometime around diydrones/ardupilot@bea6952 - am I misreading the commits/code?
    ThomasSFL
    @ThomasSFL
    @lnr0626 Well, those are some of the necessary parameters, but I am pretty sure the position controller is still being used in the actual controls code.
    Lloyd Ramey
    @lnr0626
    ah, makes sense
    ThomasSFL
    @ThomasSFL
    @lnr0626 I could be wrong. But I think I am correct (as of now).
    Randy Mackay
    @rmackay9
    @Chambana, that change is sitting in my precland12 branch. I'm not sure if I'll include it in Copter-3.4 or not. I need to be sure that it's an improvement. I haven't been able to finish that off because I'm working on some architectural changes with Leonard to make the motors library use -1 ~ +1 inputs instead of the current (slightly strange) -4500 ~ +4500 and 0 ~ 1000 ranges.
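For reference, the motors-library change described above is essentially a normalization of the input ranges; something like the following sketch (this is illustrative, not the actual ArduPilot code):

```python
# Sketch of mapping the legacy input ranges onto -1..+1 / 0..+1.

def normalize_rp_input(v):
    """Map the legacy roll/pitch range -4500..+4500 to -1..+1."""
    return max(-1.0, min(1.0, v / 4500.0))

def normalize_throttle_input(v):
    """Map the legacy throttle range 0..1000 to 0..+1."""
    return max(0.0, min(1.0, v / 1000.0))
```

The benefit is that downstream mixing code no longer needs to know about the historical centidegree/PWM-style ranges.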
    ThomasSFL
    @ThomasSFL
    @Chambana Let me know if you try out Randy's precland12 branch. I played around with it a bit after making some modifications. I like how Randy arranged the code ...... Conceptually, it is a move in the right direction, but it hasn't undergone the same amount of testing as the existing position controller.
    Chambana
    @Chambana
    I'll give it a shot. I find that I'm getting the same level of precision with IR lock as I am with opencv derived landing_target messages, so I think I may be pushing the limit of the flight controller code in master.
    ThomasSFL
    @ThomasSFL
    @Chambana Thanks for the feedback..... After much testing, I think a critical issue is that the altitude and angle-to-target readings become less reliable at low altitude... Lidar Lite has a minimum range of 20cm, and the camera sensor is not as useful when the target is at super-close range.
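One practical mitigation for the low-altitude problem described above is to reject rangefinder readings outside the sensor's usable envelope before feeding them to the landing controller. A sketch; the 20cm minimum comes from the thread, while the maximum range is an assumed illustrative figure:

```python
# Usable envelope for the rangefinder. The 20cm minimum matches the
# Lidar Lite limitation mentioned above; the maximum is an assumption.
LIDAR_MIN_RANGE_M = 0.20
LIDAR_MAX_RANGE_M = 40.0

def rangefinder_valid(reading_m):
    """Gate out readings the sensor cannot actually resolve, e.g.
    the unreliable sub-20cm readings seen just before touchdown."""
    return LIDAR_MIN_RANGE_M <= reading_m <= LIDAR_MAX_RANGE_M
```

Below the minimum range the controller would have to fall back on the last good altitude estimate or the barometer/EKF altitude.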
    a7scenario
    @a7scenario
    @ThomasSFL Another factor is the quality of your loiter. Per above, I’ve been flying with optical flow indoors. I changed the surface flooring to a more detailed/textured surface and it significantly improved my precision landing using IR Lock. I don't know if I’ll be able to replicate that level of precision outdoors using GPS though. At this point, I don't think the EKF will fuse optical flow with GPS.
    ThomasSFL
    @ThomasSFL
    @a7scenario Thanks for the update. :) Yes, you are exactly correct. The precision landing cannot perform any better than the loiter performance (using the existing control method) .... The loiter performance depends on the velocity measurements. So if your optical flow is providing better velocity readings than IMU+GPS, your results make perfect sense.
    @a7scenario However, one of my concerns is the robustness of the optical flow measurements. Have you had a good experience using optical flow? ... It is supposed to be able to work outdoors as well.
    a7scenario
    @a7scenario
    @ThomasSFL When using the PX4Flow indoors with a large amount of incandescent light, focused lens, and a highly textured floor, the optical flow provides a very stable loiter. With an untextured surface, it performs horribly. With a “semi textured” surface, it seems to have a similar loiter to what GPS can provide. It’s a hell of a lot cheaper than a motion capture system, but it’s (one of) the first things to impact the ground on a crash and I’ve had the sensors crack on me. Also, if you have anything dangling from your quad, using the px4flow will end up causing your quad to veer into the nearest wall. I really wish there was an off-the-shelf platform (like the solo) that had working optical flow, as px4flow takes some time to configure. I think the bebop 2 will eventually support optical flow on ardupilot, but not sure what the status is on that. Between the LIDAR lite, PX4Flow, and whatever you’re trying to accomplish on your companion computer, you’ll spend quite a bit of time “fiddling” with cables and settings trying to get your indoor platform back up and running after a crash.
    Randy Mackay
    @rmackay9
    That's right. Our EKF doesn't fuse GPS and optical flow.. it can fail over from GPS to optical flow but it doesn't use both at once.
    It would be possible of course, someone just needs to sweet talk Paul Riseborough
    ThomasSFL
    @ThomasSFL
    @a7scenario Thanks for the candid reply. :) I have a long list of IR-LOCK users working toward more reliable precision landing controls. The velocity estimation is a fundamental component of the controls performance, so Paul's work is very important. And it is interesting that you may have got better velocity estimations with optical flow ... However, I do a lot of outdoor development, so I probably can't rely on optical flow very much (unless it is automatically fused, as Randy mentioned).
    ThomasSFL
    @ThomasSFL
    @a7scenario ... also, if you are using optical flow indoors, the controls can behave much better due to the absence of wind.
    a7scenario
    @a7scenario
    @ThomasSFL Saw your post on diydrones. Awesome work on the “guaranteed” precision landing. Is the modified firmware available yet?
    ThomasSFL
    @ThomasSFL
    @a7scenario Thanks. The customized flight code used in the video is linked below. USER BEWARE: It is customized specifically for the particular test platform shown in the video.
    https://github.com/ThomasSFL/ardupilot/tree/Copter-3.3-Ref02
    Al Bee
    @abencomo
    We ordered a Volta 4Gmetry III a couple of weeks ago, and it arrived today. After unpacking it, I realized that we could have just ordered the ODROID-XU4 and saved some money. The Volta doesn't even include a power supply, and the available documentation is really bad and out of date. You are pretty much paying over $200 more for just a USB Chinese cable adapted with a DF13 connector to connect to the Pixhawk. Ouch! For security reasons here in the US, we cannot even use 4G for our project. Thanks a million, Silvio!
    sinisterrook
    @sinisterrook
    Trying to find getting started guides. Or any sort of documentation. Any help?
    Silvio Revelli
    @silviorevelli
    docs.voltarobots.com
    mingf
    @mingf
    hmm, I can't seem to see docs.voltarobots.com:
    Error 1001 Ray ID: 34574a2853126bfe • 2017-03-26 04:15:35 UTC
    DNS resolution error
    Adrian
    @drumadrian
    Hello, Has anyone used the 4Gemetry with a Pixhawk 2 ? I would like to try it while also using the uAvionix Ping 2020 transceiver. Thanks
    Bill Piedra
    @billrock
    Hi Silvio. I have done it with both Pixhawk 2 AND Qualcomm Snapdragon Flight. I am presently a Qualcomm Developer Advocate for Qualcomm Flight, as well as other new products in the pipeline. We just did a 1,000-flight test using 4G LTE control. https://www.qualcomm.com/news/onq/2017/05/03/qualcomm-technologies-releases-lte-drone-trial-results I've sort of become an expert in the subject. If you need any more information, please ask, and I can provide you with a copy of the test result report when it is published.
    Adrian
    @drumadrian
    @billrock that is really cool to hear about. Can I contact you for more information? My email address is adrian.drummond@gmail.com
    Bill Piedra
    @billrock
    Sure, you can reach me at billrockus@gmail.com
    Bill Piedra
    @billrock
    @drumadrian I just sent you an email.
    Adrian
    @drumadrian
    @billrock Thank you :smile:
    Andres Rengifo
    @andresR8
    Hi, I'm considering buying 4Gmetry to connect remotely to my rover. My question is: does 4Gmetry have any restrictions on internet data plans? I would like to use a T-Mobile unlimited data plan for smartphones.
    Adrian
    @drumadrian
    I'm not sure if this chat group is very active. It may take a little while to get your answer.