Thursday, July 14, 2016

7.5 Blog Activity: Sense and Avoid Sensor Selection


Vision/Optical Sensing and Avoidance

Gabriel P. Riccio

7.4 Research Assignment: Sense and Avoid

UNSY 605 Unmanned Systems Sensing, Perception, and Processing

Embry-Riddle Aeronautical University-Worldwide
14Jul16


Small Unmanned Aerial Systems (sUAS) have zero autonomy while being directly piloted by a human; the pilot is responsible for all sensing and avoidance of other aircraft and obstacles.  Semi-autonomous and autonomous sUAS must have on-board sense-and-avoid technologies to reduce the risk of unwanted collisions.  The Defense Advanced Research Projects Agency (DARPA) has developed a fully autonomous sUAS quadcopter that uses high-definition cameras, LiDAR (Light Detection and Ranging), sonar, and inertial measurement units for sensing and avoidance (Szondy, 2016).  This combination of sensors has proven very successful in DARPA's testing of the quadcopter; however, carrying multiple sensors may not be practical for other sUAS applications.  Skydio, a new startup company in California, is working diligently to improve sUAS sensing and to develop technologies that make sUAS safer and improve autonomous flight capabilities (Popper, 2015).  Skydio engineers believe they can achieve effective sensing and avoidance with standalone ordinary cameras, without sonar or lasers (Popper, 2015).  The Phantom 4, a semi-autonomous sUAS, is likewise equipped with vision sensors for sensing and avoidance (Bolton, 2016).  If the Phantom 4 flies within 50 feet of an obstacle it begins to slow, and if it flies within 6 feet of an obstacle it comes to a complete stop (Bolton, 2016).  Given the success of the Phantom 4 and other similar sUAS, vision/optical sensors are an excellent sensor choice for obstacle avoidance.
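To make that behavior concrete, the following minimal Python sketch shows how such a graduated response could be implemented.  The 50-foot and 6-foot thresholds come from Bolton (2016); the linear speed taper and the function itself are illustrative assumptions, not DJI's actual control law.

# Minimal sketch of the graduated obstacle response described for the
# Phantom 4 (Bolton, 2016): slow inside ~50 ft, full stop inside ~6 ft.
# Thresholds are from the article; everything else is illustrative.

SLOW_RANGE_FT = 50.0   # begin decelerating inside this distance
STOP_RANGE_FT = 6.0    # hold a full stop inside this distance

def commanded_speed(obstacle_distance_ft: float, requested_speed: float) -> float:
    """Scale the pilot's requested speed based on obstacle distance."""
    if obstacle_distance_ft <= STOP_RANGE_FT:
        return 0.0  # too close: stop and hover
    if obstacle_distance_ft >= SLOW_RANGE_FT:
        return requested_speed  # clear path: no intervention
    # Linearly taper speed between the two thresholds.
    fraction = (obstacle_distance_ft - STOP_RANGE_FT) / (SLOW_RANGE_FT - STOP_RANGE_FT)
    return requested_speed * fraction

print(commanded_speed(60.0, 10.0))  # 10.0 -- path is clear, no intervention
print(commanded_speed(28.0, 10.0))  # 5.0  -- halfway through the taper
print(commanded_speed(5.0, 10.0))   # 0.0  -- full stop and hover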
            Faster and more powerful computers, along with newer algorithms, have enhanced the effectiveness of vision systems for sUAS (Barry, Oleynikova, Honegger, Pollefeys, & Tedrake, n.d.).  Vision sensors have proved successful in autonomous flight from takeoff to landing while providing obstacle avoidance (Barry et al., n.d.).  When sUAS are outfitted with stereo vision, individual 2-dimensional images are combined to create 3-dimensional images when appropriately referenced and processed (Barry et al., n.d.).  Notable concerns for designers selecting a vision system for their platform are the latency of the data stream, power consumption, and the synchronization of multiple image exposures (Barry et al., n.d.).
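The core calculation behind combining paired 2-dimensional images into 3-dimensional data is the standard stereo triangulation relation: depth equals focal length times camera baseline divided by pixel disparity.  The short sketch below illustrates it with made-up example numbers; it is a textbook illustration, not code from Barry et al.

# Illustrative sketch of the core stereo-vision calculation for turning
# paired 2-D images into 3-D points. The camera values below are example
# numbers, not any particular platform's actual parameters.

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Classic pinhole stereo relation: depth = focal * baseline / disparity."""
    if disparity_px <= 0:
        raise ValueError("zero or negative disparity: point at infinity or a bad match")
    return focal_px * baseline_m / disparity_px

# Example: 400 px focal length, 10 cm baseline, 8 px disparity -> 5 m depth.
print(depth_from_disparity(400.0, 0.10, 8.0))  # 5.0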
            The DJI Phantom 4 has front obstacle sensors that work in conjunction with its computer vision and processing to react to and avoid obstacles in its path ("Phantom 4 - DJI’s smartest flying camera ever," 2016).  In “Normal Mode” the platform stops and hovers if an obstacle is in its path; in other modes, it alters its flight path to avoid the obstacle, or comes to a hover if need be ("Phantom 4 - DJI’s smartest flying camera ever," 2016).  The optical sensing system has a 60-degree by 50-degree field of view, and the data it collects are used to create a 3-dimensional map for obstacle avoidance ("Inside a Drone: Computer Vision," 2016); a rough sketch of this mapping step follows the specification list below.  Additionally, the platform has dual cameras mounted on the bottom and dual ultrasonic sensors for position accuracy ("Phantom 4 - DJI’s smartest flying camera ever," 2016).  It weighs 1,380 grams and has a top speed of 20 meters per second ("Phantom 4 - DJI’s smartest flying camera ever," 2016).  At a retail price of just under $1,400, it is not cheap, but it is reasonable given all of its embedded autonomous capabilities (Popper, 2016).  Some important specifications of the Phantom 4 obstacle sensing system, from the DJI product website, are as follows:
·         Obstacle Sensory Range – 2 feet to 49 feet
·         Width of Optical Sensing System – 0.7 meters to 15 meters
·         Operating Environment – Surface with clear pattern and adequate lighting (lux > 15)
·         Altitude and Operating Range of the Positioning System – 0 feet – 33 feet ("Phantom 4 - DJI’s smartest flying camera ever," 2016).
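DJI does not publish how the Phantom 4 constructs its 3-dimensional map, so the following Python sketch only illustrates the general idea: depth points that fall within the stated 60-degree by 50-degree field of view are accumulated into a coarse voxel grid.  The voxel size, helper functions, and data layout are assumptions for illustration only.

# Rough sketch of accumulating depth measurements within a 60 x 50 degree
# field of view into a sparse 3-D occupancy map. Grid resolution and data
# layout are assumptions; DJI's actual mapping internals are not published.
import math

VOXEL_SIZE_M = 0.5          # assumed map resolution
occupied = set()            # sparse voxel map: set of (ix, iy, iz) indices

def in_field_of_view(x: float, y: float, z: float) -> bool:
    """Check a forward-frame point against a 60-deg horizontal, 50-deg vertical FOV."""
    if x <= 0:  # behind the sensor
        return False
    horiz = math.degrees(math.atan2(y, x))
    vert = math.degrees(math.atan2(z, x))
    return abs(horiz) <= 30.0 and abs(vert) <= 25.0

def insert_point(x: float, y: float, z: float) -> None:
    """Mark the voxel containing a sensed obstacle point as occupied."""
    if in_field_of_view(x, y, z):
        occupied.add((int(x // VOXEL_SIZE_M),
                      int(y // VOXEL_SIZE_M),
                      int(z // VOXEL_SIZE_M)))

insert_point(5.0, 1.0, -0.5)   # obstacle ahead, slightly right and below
print(len(occupied))           # 1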
In conclusion, the DJI Phantom 4 demonstrates how standalone vision/optical sensors, coupled with fast computing power, can be successfully engineered into sUAS for sensing and obstacle avoidance.  If the goal is full autonomy, the platform will need multiple sensors, as on DARPA's fully autonomous quadcopter.  The dollar cost of DARPA's quadcopter was not presented in any literature found during the research for this paper; however, it can be reasonably hypothesized that DARPA has invested a considerable amount of money in the project.  For the retail consumer looking for a sUAS with semi-autonomous functionality, selecting one with vision/optical sensors is an excellent choice.
References
Barry, A., Oleynikova, H., Honegger, D., Pollefeys, M., & Tedrake, R. (n.d.). Fast onboard stereo for vision UAVs. Retrieved from http://groups.csail.mit.edu/robotics-center/public_papers/Barry15a.pdf
Bolton, D. (2016, March 2). DJI unveils the Phantom 4 semi-autonomous drone. The Independent. Retrieved from http://www.independent.co.uk/life-style/gadgets-and-tech/news/dji-phantom-4-drone-price-buy-autonomous-tapfly-activetrack-a6908096.html
Inside a Drone: Computer Vision. (2016). Retrieved from http://www.dji.com/newsroom/news/inside-a-drone-computer-vision
Phantom 4 - DJI’s smartest flying camera ever. (2016). Retrieved from https://www.dji.com/product/phantom-4
Popper, B. (2015, January 15). A tiny startup has made big strides in creating self-navigating drones. The Verge. Retrieved from http://www.theverge.com/2015/1/15/7550669/skydio-drone-sense-and-avoid-camera-vision
Popper, B. (2016, March 1). DJI's revolutionary Phantom 4 drone can dodge obstacles and track humans. The Verge. Retrieved from http://www.theverge.com/2016/3/1/11134130/dji-phantom-4-drone-autonomous-avoidance-tracking-price-video
Szondy, D. (2016, February 12). DARPA's fully-loaded quadcopter autonomously navigates an indoor maze at 45 mph. Retrieved from http://www.gizmag.com/darpa-drone-autonomous-45-mph/41810/




Wednesday, July 6, 2016

6.4 Research Assignment: Control Station Analysis


Black Knight Unmanned Vehicle Control Station
The Black Knight is an armored Unmanned Ground Combat Vehicle (UGCV) (Valois, Herman, Bares, & Rice, 2008).  This UGCV is capable of forward scouting, surveillance, and target acquisition, and can perform missions deemed too hazardous for military personnel (National Robotics Engineering Center, 2016).  The 12-ton Black Knight, with its 300-horsepower diesel engine, can achieve speeds up to 15 miles per hour (mph) in off-highway terrain while teleoperated or in autonomous mode (National Robotics Engineering Center, 2016).  The National Robotics Engineering Center (NREC) designed and engineered the control, teleoperation, perception, and on-board safety systems (National Robotics Engineering Center, 2016).  The platform is equipped with Light Detection and Ranging (LiDAR) technology, Forward-Looking Infrared (FLIR), sophisticated stereo video cameras, Global Positioning System (GPS) sensors, a wireless data link, and sensors that support both semi-autonomous and autonomous operations (National Robotics Engineering Center, 2016).  The vehicle is controlled by an operator in another vehicle from the Robotic Operator Control Station (ROCS), or off-board with a safety controller (National Robotics Engineering Center, 2016).

Figure 1. Black Knight unmanned ground vehicle with 25-mm cannon. Adapted from “Black Knight prototype unmanned combat vehicle” by Military-Today.com (n.d.). Retrieved from http://www.military-today.com/apc/black_knight.htm.

As stated previously, the Black Knight is controlled through the ROCS, which is located with the operator in another, independent armored vehicle (Valois et al., 2008).  The ROCS is composed of a video monitor that displays synthesized views from the driving camera, the Operator Control Software (OCS), and the hand controller (Valois et al., 2008).  The OCS runs on Microsoft Windows and controls the Black Knight through the ROCS interface (Valois et al., 2008).
Figure 2. Black Knight Robotic Operator Control Station. Adapted from “Remote operation of the black knight unmanned ground combat vehicle” by J. Valois, H. Herman, J. Bares, & D. Rice (2008).
The Autonomy, Perception, and Control Module (APCM) “contains all of the sensors, computers, power management, electronics, and networking equipment required to safely perform remote autonomous operations” (Valois et al., 2008).  The APCM planner module provides orientation information to the driver via the ROCS (Valois et al., 2008).  The driver uses the hand controller to input driving commands to the Black Knight and to control the cameras (Valois et al., 2008).
Figure 3. Black Knight Modules and Operators. Adapted from “Remote operation of the black knight unmanned ground combat vehicle” by J. Valois, H. Herman, J. Bares, & D. Rice (2008).
During teleoperation, the driver uses the joysticks on the hand controller for steering and speed control (Valois et al., 2008).
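Valois et al. (2008) do not document the actual joystick mapping, so the sketch below is only a plausible illustration of how stick deflections might be translated into steering and speed commands.  The axis conventions, deadband, and use of the platform's 15-mph top speed as the limit are assumptions.

# Hypothetical sketch of mapping hand-controller joystick axes to
# teleoperation drive commands. Axis conventions, deadband, and limits
# are assumptions for illustration, not the Black Knight's actual mapping.

MAX_SPEED_MPH = 15.0   # platform top speed (National Robotics Engineering Center, 2016)
DEADBAND = 0.05        # assumed stick deadband to ignore noise around center

def drive_command(stick_x: float, stick_y: float) -> tuple[float, float]:
    """Map stick deflections in [-1, 1] to (steering, speed) commands."""
    def apply_deadband(v: float) -> float:
        return 0.0 if abs(v) < DEADBAND else v
    steering = apply_deadband(stick_x)               # -1 full left .. +1 full right
    speed = apply_deadband(stick_y) * MAX_SPEED_MPH  # forward/reverse speed in mph
    return steering, speed

print(drive_command(0.02, 0.5))  # (0.0, 7.5) -- deadband zeroes the tiny steering input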

In addition to the ROCS hand controller, the vehicle has a remote hand controller with a dedicated wireless connection to the Vehicle Controller Unit (VCU) (Valois et al., 2008).  This hand controller is used by a safety officer, who is located in a separate vehicle (Valois et al., 2008).  If an unsafe condition occurs, the safety officer can issue a “stop” command (Valois et al., 2008).  The hand controller can also be used for dismounted operations (Valois et al., 2008).
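The paper confirms only that the safety controller can issue a “stop” command over its dedicated link; the heartbeat timeout in the sketch below is a common safety-system pattern added for illustration, not a documented Black Knight feature.

# Hedged sketch of a remote safety stop. The latched "STOP" command follows
# Valois et al. (2008); the heartbeat timeout is an assumed, common safety
# pattern, not a documented feature of the actual vehicle.
import time

HEARTBEAT_TIMEOUT_S = 0.5  # assumed: halt if the safety link goes silent

class SafetyMonitor:
    def __init__(self) -> None:
        self.stopped = False
        self.last_heartbeat = time.monotonic()

    def on_message(self, message: str) -> None:
        self.last_heartbeat = time.monotonic()
        if message == "STOP":
            self.stopped = True  # latched: requires an explicit reset to resume

    def motion_allowed(self) -> bool:
        link_alive = time.monotonic() - self.last_heartbeat < HEARTBEAT_TIMEOUT_S
        return link_alive and not self.stopped

monitor = SafetyMonitor()
monitor.on_message("STOP")
print(monitor.motion_allowed())  # False -- the vehicle must halt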

The Black Knight faces its own challenges, and the most significant control station challenge is the operators themselves.  During operations the vehicle takes a great deal of abuse from its operators; because the drivers are not actually inside the Black Knight, they tend to treat it more harshly and recklessly (Valois et al., 2008).  I would recommend additional training for the drivers to ensure they do not unnecessarily drive the vehicles to the breaking point.  The platform needs to be treated with the same care and concern as if the operators were actually inside it.
References
Black Knight prototype combat vehicle. (n.d.). Retrieved from http://www.military-today.com/apc/black_knight.htm
National Robotics Engineering Center. (2016). Black Knight Overview. Retrieved from http://www.nrec.ri.cmu.edu/projects/black_knight/
Valois, J., Herman, H., Bares, J., & Rice, D. P. (2008). Remote operation of the Black Knight unmanned ground combat vehicle. Proceedings of SPIE, 6962. doi:10.1117/12.782109