With regard to line of sight and commercial drone operations in the US, POTUS issued a memo directing the FAA and related government agencies to figure out how to safely integrate UAVs into the NAS (National Airspace System). The most immediate result has been the FAA's issuance of Section 333 Exemptions.<p>Within the Section 333 Exemption there is language regarding being able to see the drones that specifically states: the UA must be operated within VLOS (visual line of sight) of the Pilot in Command; all operations must use a Visual Observer, and the UA must remain within VLOS of the Visual Observer; and the VO and PIC must be able to communicate verbally at all times during the flight (which precludes texting or electronic messaging).<p>The FAA has gone even further by requiring that PICs be actual FAA-licensed pilots.<p>This language works well for most commercial uses (construction, agriculture, and photography) but virtually kills Amazon's hopes for an unmanned delivery network.
I was prepared to be skeptical but this is actually pretty cool. The thing that's never discussed, however, is that these demonstrations <i>always</i> require fast, accurate, external motion capture systems (Vicon, etc.), and wouldn't work outdoors. I'm not aware of anything currently available that gets close to this level of speed and accuracy for vehicle state estimation, which will make it very difficult to apply these techniques in real-world situations.<p>Very cool proof of concept though.
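For contrast, here is a rough sketch (purely illustrative, not from the paper) of the kind of state estimation a drone can do onboard without a Vicon rig: a complementary filter fusing IMU gyro and accelerometer samples into roll/pitch. The sample rate and sensor values below are made up. It gives degree-level attitude only, with no position estimate at all, versus the millimeter-accurate full 6-DOF pose at hundreds of Hz that an external motion capture system provides.

```python
import math

def complementary_filter(gyro_rates, accels, dt, alpha=0.98):
    """Fuse gyro and accelerometer samples into roll/pitch estimates.

    gyro_rates: list of (gx, gy) angular rates in rad/s (body frame)
    accels:     list of (ax, ay, az) accelerations in m/s^2
    dt:         sample period in seconds (e.g. 0.005 for a 200 Hz IMU)
    alpha:      blend factor; higher trusts the gyro integration more
    """
    roll, pitch = 0.0, 0.0
    estimates = []
    for (gx, gy), (ax, ay, az) in zip(gyro_rates, accels):
        # Accelerometer gives an absolute (but noisy) gravity reference.
        acc_roll = math.atan2(ay, az)
        acc_pitch = math.atan2(-ax, math.hypot(ay, az))
        # Gyro integration is smooth but drifts; blend the two sources.
        roll = alpha * (roll + gx * dt) + (1 - alpha) * acc_roll
        pitch = alpha * (pitch + gy * dt) + (1 - alpha) * acc_pitch
        estimates.append((roll, pitch))
    return estimates

# Example: a roughly stationary vehicle with a slightly noisy IMU.
gyro = [(0.001, -0.002)] * 200      # rad/s, ~200 Hz for 1 s
acc = [(0.1, -0.05, 9.81)] * 200    # m/s^2, gravity-dominated
print(complementary_filter(gyro, acc, dt=0.005)[-1])
```

The gap between that and the precise full-pose feedback these demos rely on is exactly why replicating them outdoors is so hard.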
I want to see this in the wild; it seems to me that doing it in a room with camera sensors reduces the complexity too much.<p>Not to mention the usefulness.<p>The scenario I would like to see is: I am out hiking and want to get over a river, so I take my drone out of my backpack, attach a rope, and let it build a bridge.
Url changed from <a href="http://www.engadget.com/2015/09/19/watch-these-drones-build-a-rope-bridge/" rel="nofollow">http://www.engadget.com/2015/09/19/watch-these-drones-build-...</a>, which points to this.