Thursday, July 14, 2016

7.5 Blog Activity: Sense and Avoid Sensor Selection


Vision/Optical Sensing and Avoidance

Gabriel P. Riccio

7.4 Research Assignment: Sense and Avoid

UNSY 605 Unmanned Systems Sensing, Perception, and Processing

Embry-Riddle Aeronautical University-Worldwide
14Jul16


While directly piloted by a human, small unmanned aerial systems (sUAS) operate with zero autonomy: the pilot is responsible for all sensing and for avoiding other aircraft and obstacles.  Semi-autonomous and autonomous sUAS must have on-board sense-and-avoid technologies to reduce the risk of unwanted collisions.  The Defense Advanced Research Projects Agency (DARPA) has developed a fully autonomous sUAS quadcopter that uses high-definition cameras, LiDAR (Light Detection and Ranging), sonar, and inertial measurement units for sensing and avoidance (Szondy, 2016).  This combination of sensors proved very successful in DARPA's testing of the quadcopter; however, carrying multiple sensors may not be practical for other sUAS applications.  Skydio, a startup company in California, is working diligently to improve sUAS sensing by developing technologies that make sUAS safer and improve autonomous flight capabilities (Popper, 2015).  Skydio engineers believe they can achieve effective sensing and avoidance with ordinary standalone cameras, without sonar or lasers (Popper, 2015).  The Phantom 4 semi-autonomous sUAS is equipped with vision sensors for sensing and avoidance (Bolton, 2016).  If the Phantom 4 flies within 50 feet of an obstacle it begins to slow, and it comes to a complete stop within 6 feet of the obstacle (Bolton, 2016).  Given the success of the Phantom 4 and similar sUAS, vision/optical sensors are an excellent choice for obstacle avoidance.
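The slow-then-stop behavior described above can be pictured as a simple distance-based speed cap.  The sketch below is illustrative only, not DJI's implementation; the 50-foot and 6-foot thresholds come from the text, while the linear ramp between them is an assumption.

# Illustrative proximity-based speed governor, loosely modeled on the
# Phantom 4 behavior described above. The thresholds come from the text;
# the linear ramp between them is an assumption, not DJI's control law.

SLOW_RANGE_FT = 50.0  # begin decelerating inside this range
STOP_RANGE_FT = 6.0   # hover (zero speed) inside this range

def max_allowed_speed(obstacle_distance_ft, cruise_speed):
    """Return the speed cap for a given distance to the nearest obstacle."""
    if obstacle_distance_ft <= STOP_RANGE_FT:
        return 0.0  # come to a complete stop and hover
    if obstacle_distance_ft >= SLOW_RANGE_FT:
        return cruise_speed  # no obstacle influence
    fraction = (obstacle_distance_ft - STOP_RANGE_FT) / (SLOW_RANGE_FT - STOP_RANGE_FT)
    return cruise_speed * fraction

# Example: 28 feet from an obstacle, a 20 m/s platform is capped at 10 m/s.
print(max_allowed_speed(28.0, 20.0))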
            Faster, more powerful computers and newer algorithms have enhanced the effectiveness of vision systems for sUAS (Barry, Oleynikova, Honegger, Pollefeys, & Tedrake, n.d.).  Vision sensors have proved successful in autonomous flight from takeoff to landing while providing obstacle avoidance (Barry et al., n.d.).  When a sUAS is outfitted with stereo vision, pairs of 2-dimensional images are combined to recover 3-dimensional structure when appropriately referenced and processed (Barry et al., n.d.).  Notable concerns for designers selecting a vision system for their platform are the latency of the data stream, power consumption, and the synchronization of multiple image exposures (Barry et al., n.d.).
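At its core, stereo ranging rests on one relationship: depth equals focal length times camera baseline divided by pixel disparity.  A minimal sketch follows, using made-up camera parameters rather than those of any particular sUAS.

# Minimal stereo triangulation sketch: depth = (focal length * baseline) / disparity.
# The camera parameters below are made-up examples.

FOCAL_LENGTH_PX = 400.0   # focal length in pixels (assumed)
BASELINE_M = 0.10         # separation between the two cameras, meters (assumed)

def depth_from_disparity(disparity_px):
    """Depth in meters for a feature seen with the given pixel disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive; zero means infinitely far")
    return (FOCAL_LENGTH_PX * BASELINE_M) / disparity_px

# A feature shifted 20 pixels between the left and right images is 2 m away.
print(depth_from_disparity(20.0))  # 2.0

Note the inverse relationship: range resolution degrades quickly with distance, which is one reason stereo obstacle avoidance is most dependable at short range.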
            The DJI Phantom 4 has front obstacle sensors that work in conjunction with its computer vision and processing to react to and avoid obstacles in its path ("Phantom 4 - DJI’s smartest flying camera ever," 2016).  In “Normal Mode” the platform stops and hovers if an obstacle is in its path; in other modes, it alters its flight path to avoid the obstacle or comes to a hover if need be ("Phantom 4 - DJI’s smartest flying camera ever," 2016).  The optical sensing system has a 60-degree by 50-degree field of view and uses the data it collects to create a 3-dimensional map for obstacle avoidance ("Inside a Drone: Computer Vision," 2016).  Additionally, the platform has dual cameras mounted on the bottom and dual ultrasonic sensors for position accuracy ("Phantom 4 - DJI’s smartest flying camera ever," 2016).  It weighs 1380 grams and has a top speed of 20 meters per second ("Phantom 4 - DJI’s smartest flying camera ever," 2016).  At a retail price of just under $1,400, it is not cheap, but it is reasonable given all of its embedded autonomous capabilities (Popper, 2016).  Some important specifications of the Phantom 4 obstacle sensing system, from the DJI company product website, are as follows (a short sketch after this list shows how the stated field of view translates into coverage width at range):
·         Obstacle Sensory Range – 2 feet to 49 feet
·         Width of Optical Sensing System – 0.7 meters to 15 meters
·         Operating Environment – Surface with clear pattern and adequate lighting (lux > 15)
·         Altitude and Operating Range of the Positioning System – 0 feet to 33 feet ("Phantom 4 - DJI’s smartest flying camera ever," 2016).
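As a back-of-the-envelope check (an illustration, not a DJI specification), the width a sensor covers at a given range follows from trigonometry: width = 2 × range × tan(FOV/2).

import math

# Back-of-the-envelope check: the width seen by a sensor with horizontal
# field of view FOV at range r is 2 * r * tan(FOV / 2). The 60-degree figure
# comes from the text; treating it as the horizontal axis is an assumption.

def coverage_width(range_m, fov_deg):
    return 2.0 * range_m * math.tan(math.radians(fov_deg) / 2.0)

# At 15 m (the far end of the stated sensing width), 60 degrees spans ~17 m.
print(round(coverage_width(15.0, 60.0), 1))  # 17.3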
In conclusion, the DJI Phantom 4 shows how standalone vision/optical sensors coupled with fast computing power can be successfully engineered into a sUAS for sensing and obstacle avoidance.  If the goal is full autonomy, the platform will need multiple sensor types, as on DARPA's fully autonomous quadcopter.  The dollar cost of DARPA's quadcopter was not presented in any of the literature reviewed for this paper; however, it can reasonably be hypothesized that DARPA has invested a considerable amount of money in the project.  For the retail consumer looking for a sUAS with semi-autonomous functionality, selecting one with vision/optical sensors is an excellent choice.
References
Barry, A., Oleynikova, H., Honegger, D., Pollefeys, M., & Tedrake, R. (n.d.). Fast onboard stereo for vision UAVs. Retrieved from http://groups.csail.mit.edu/robotics-center/public_papers/Barry15a.pdf
Bolton, D. (2016, March 2). DJI unveils the Phantom 4 semi-autonomous drone | News | Lifestyle | The Independent. Retrieved from http://www.independent.co.uk/life-style/gadgets-and-tech/news/dji-phantom-4-drone-price-buy-autonomous-tapfly-activetrack-a6908096.html
Inside a Drone: Computer Vision. (2016). Retrieved from http://www.dji.com/newsroom/news/inside-a-drone-computer-vision
Phantom 4 - DJI’s smartest flying camera ever. (2016). Retrieved from https://www.dji.com/product/phantom-4
Popper, B. (2015, January 15). A tiny startup has made big strides in creating self-navigating drones | The Verge. Retrieved from http://www.theverge.com/2015/1/15/7550669/skydio-drone-sense-and-avoid-camera-vision
Popper, B. (2016, March 1). DJI's revolutionary Phantom 4 drone can dodge obstacles and track humans | The Verge. Retrieved from http://www.theverge.com/2016/3/1/11134130/dji-phantom-4-drone-autonomous-avoidance-tracking-price-video
Szondy, D. (2016, February 12). DARPA's fully-loaded quadcopter autonomously navigates an indoor maze at 45 mph. Retrieved from http://www.gizmag.com/darpa-drone-autonomous-45-mph/41810/




Wednesday, July 6, 2016

6.4 Research Assignment Control Station Analysis


Black Knight Unmanned Vehicle Control Station
The Black Knight is an armored Unmanned Ground Combat Vehicle (UGCV) (Valois, Herman, Bares, & Rice, 2008).  This UGCV is capable of forward scouting, surveillance, target acquisition, and performing missions deemed too hazardous for military personnel (National Robotics Engineering Center, 2016).  The 12-ton Black Knight, with its 300-horsepower diesel engine, can reach speeds of up to 15 miles per hour (mph) in off-highway terrain while teleoperated or in autonomous mode (National Robotics Engineering Center, 2016).  The National Robotics Engineering Center (NREC) designed and engineered the control, teleoperation, perception, and on-board safety systems (National Robotics Engineering Center, 2016).  The platform is equipped with Light Detection and Ranging (LiDAR) technology, Forward Looking Infrared (FLIR), sophisticated stereo video cameras, Global Positioning System (GPS) sensors, a wireless data link, and sensors that support both semi-autonomous and autonomous operations (National Robotics Engineering Center, 2016).  The vehicle is controlled by an operator in another vehicle from the Robotic Operator Control Station (ROCS), or off-board with a safety controller (National Robotics Engineering Center, 2016).

Figure 1. Black Knight unmanned ground vehicle with 25-mm cannon. Adapted from “Black Knight prototype unmanned combat vehicle” by Military-Today.com (n.d.). Retrieved from http://www.military-today.com/apc/black_knight.htm.

As stated previously, the Black Knight is controlled through the ROCS, which is located with the operator in a separate armored vehicle (Valois et al., 2008).  The ROCS is composed of a video monitor, which displays synthesized views from the driving camera; the Operator Control Software (OCS); and the hand controller (Valois et al., 2008).  The OCS runs on Microsoft Windows and controls the Black Knight through the ROCS interface (Valois et al., 2008).
Figure 2. Black Knight Robotic Operator Control Station. Adapted from “Remote operation of the black knight unmanned ground combat vehicle” by J. Valois, H. Herman, J. Bares, & D. Rice (2008).
The Autonomy, Perception, and Control Module (APCM) “contains all of the sensors, computers, power management, electronics, and networking equipment required to safely perform remote autonomous operations” (Valois et al., 2008).  The APCM’s planner module provides orientation information to the driver via the ROCS (Valois et al., 2008).  The driver uses the hand controller to input driving commands to the Black Knight and to control the cameras (Valois et al., 2008).
Figure 3. Black Knight Modules and Operators. Adapted from “Remote operation of the black knight unmanned ground combat vehicle” by J. Valois, H. Herman, J. Bares, & D. Rice (2008).
During teleoperation, the driver uses the joysticks on the hand controller for steering and speed control (Valois et al., 2008).
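A hedged sketch of what such a stick-to-command mapping might look like follows; the axis conventions, deadband, and scaling are assumptions for illustration, not NREC's control software.

# Illustrative teleoperation mapping from joystick axes to drive commands.
# Axis conventions, deadband, and limits are assumptions; they do not
# reproduce NREC's actual software.

MAX_SPEED_MPH = 15.0   # platform top speed, from the text above
DEADBAND = 0.05        # ignore tiny stick deflections (assumed)

def stick_to_command(throttle_axis, steer_axis):
    """Map normalized stick axes in [-1, 1] to speed and steering commands."""
    def shaped(x):
        if abs(x) < DEADBAND:
            return 0.0
        return max(-1.0, min(1.0, x))
    return {
        "speed_mph": shaped(throttle_axis) * MAX_SPEED_MPH,
        "steer": shaped(steer_axis),  # -1 full left .. +1 full right
    }

# Half throttle with a slight right turn:
print(stick_to_command(0.5, 0.2))  # {'speed_mph': 7.5, 'steer': 0.2}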

In addition to the ROCS hand controller, the vehicle has a remote hand controller with a dedicated wireless connection to the Vehicle Controller Unit (VCU) (Valois et al., 2008).  This hand controller is used by a safety officer located in a separate vehicle (Valois et al., 2008).  If an unsafe condition occurs, the safety officer can issue a “stop” command (Valois et al., 2008).  The hand controller can also be used for dismounted operations (Valois et al., 2008).
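The stop command amounts to a highest-priority override.  A minimal sketch of such a safety gate appears below; the heartbeat timeout and message names are assumptions, not the Black Knight's actual VCU protocol.

import time

# Minimal safety-gate sketch: a dedicated stop command always wins, and a
# missed heartbeat from the safety controller is treated as a stop. The
# timeout and message names are assumptions, not the actual VCU protocol.

HEARTBEAT_TIMEOUT_S = 0.5  # assumed link-watchdog interval

class SafetyGate:
    def __init__(self):
        self.stop_latched = False
        self.last_heartbeat = time.monotonic()

    def on_safety_message(self, msg):
        if msg == "STOP":
            self.stop_latched = True  # latch until explicitly cleared
        self.last_heartbeat = time.monotonic()

    def vehicle_may_move(self):
        link_alive = (time.monotonic() - self.last_heartbeat) < HEARTBEAT_TIMEOUT_S
        return link_alive and not self.stop_latched

gate = SafetyGate()
gate.on_safety_message("HEARTBEAT")
print(gate.vehicle_may_move())  # True while the link is fresh and no stop is latched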

The Black Knight faces its own challenges; the most significant control station challenge is the operators themselves.  During operations the vehicle takes a great deal of abuse: because the drivers are not actually inside the Black Knight, they tend to treat it more harshly and recklessly (Valois et al., 2008).  I would recommend additional training for the drivers to ensure they do not unnecessarily drive the vehicle to its breaking point.  The platform needs to be treated with the same care and concern as if the operators were actually inside it.
References
Black Knight prototype combat vehicle. (n.d.). Retrieved from http://www.military-today.com/apc/black_knight.htm
National Robotics Engineering Center. (2016). Black Knight Overview. Retrieved from http://www.nrec.ri.cmu.edu/projects/black_knight/
Valois, J., Herman, H., Bares, J., & Rice, D. P. (2008). Remote operation of the Black Knight unmanned ground combat vehicle. Proceedings of SPIE, 6962(1). doi:10.1117/12.782109


Friday, June 24, 2016

4.6 Blog: Unmanned System Data Protocol and Format


RQ-4 Global Hawk

Gabriel P. Riccio

4.5 Research Assignment Unmanned System Data Protocol and Format

UNSY 605 Unmanned Systems Sensing, Perception, and Processing

Embry-Riddle Aeronautical University-Worldwide


      The RQ-4 Global Hawk is a highly sophisticated unmanned aerial system (UAS) capable of performing high-altitude, long-endurance (HALE) aerial surveillance and reconnaissance over large geographical areas for the purpose of providing data to battlefield commanders ("RQ-4 Block 40 Global Hawk," 2012).  Outfitted with air-to-surface radar, the Global Hawk can monitor both fixed and moving targets in all weather conditions ("RQ-4 Block 40 Global Hawk," 2012).  The platform grew out of a 1990s DARPA (Defense Advanced Research Projects Agency) program ("Northrop Grumman RQ-4 Global Hawk - Copybook," n.d.).  The first Global Hawk produced by Northrop Grumman was the RQ-4A Block 10; the most current version is Block 40 ("Northrop Grumman RQ-4 Global Hawk - Copybook," n.d.).  The nomenclature RQ-4 breaks down as follows: the “R” means reconnaissance, the “Q” means unmanned, and the digit “4” denotes the fourth type ("Northrop Grumman RQ-4 Global Hawk - Copybook," n.d.).
       The original RQ-4A was equipped with electro-optical and infrared sensors along with synthetic aperture radar (SAR) ("Northrop Grumman RQ-4 Global Hawk - Copybook," n.d.).  Later models of the Global Hawk came with imagery intelligence sensors, airborne signals intelligence payload sensors, the multi-platform radar technology insertion program system, and active electronically scanned array radar ("Northrop Grumman RQ-4 Global Hawk - Copybook," n.d.).  It is important to highlight that the Global Hawk is a complete system: in addition to the aircraft and its payloads/sensors, it includes the data links used to download data and the ground stations that control it, and it requires substantial logistical support (Kinzig, 2010).  The on-board communication system enables command and control of the platform and its payload, as well as the transfer of data (Kinzig, 2010).  Data can be disseminated by ultra-high frequency line of sight (UHF LOS), common data link line of sight (CDL LOS), Ku-band satellite communications, UHF satellite communications, and other satellite communications (Kinzig, 2010).
Figure 1. The many methods of disseminating Global Hawk data. Adapted from “Global Hawk systems engineering case study” by B. Kinzig (2010). Retrieved from www.dtic.mil/dtic/tr/fulltext/u2/a538761.pdf
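One way to picture the use of multiple dissemination paths is as an ordered fallback list.  The priority ordering and availability checks below are assumptions made for illustration, not actual Global Hawk communications doctrine.

# Illustrative fallback across the dissemination paths named above.
# Priority order and availability checks are assumptions, not Global Hawk
# communications doctrine.

DATA_LINKS = [
    "CDL LOS",      # common data link, line of sight
    "UHF LOS",      # ultra-high frequency, line of sight
    "Ku SATCOM",    # Ku-band satellite communications
    "UHF SATCOM",   # UHF satellite communications
]

def select_downlink(available):
    """Return the first usable link in priority order, or None if all are down."""
    for link in DATA_LINKS:
        if link in available:
            return link
    return None

# Beyond line of sight, only the satellite paths remain usable:
print(select_downlink({"Ku SATCOM", "UHF SATCOM"}))  # Ku SATCOM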
       As stated previously, the biggest evolution in the platform has been the upgrades to the sensor packages.  The imagery intelligence payload, which consists of the electro-optical and infrared sensor (EO/IR) and synthetic aperture radar (SAR), collects high-resolution imagery for intelligence gathering (RQ-4B Global Hawk block 30 operational test and evaluation report, 2011).  Radio frequency signals are collected by the signals intelligence payload and processed to support intelligence operations; the payload is also capable of automatic signal detection, location, direction finding, and identification (RQ-4B Global Hawk block 30 operational test and evaluation report, 2011).  Refer to Figures 2 and 3 for sensor summary data.  The data format is NITF (National Imagery Transmission Format) standard 2.1, meaning it complies with prescribed military standards that increase capability and flexibility compared to previous formats ("National Imagery Transmission Format Standard (NITFS)," n.d.).  NITF 2.1 is backward compatible with earlier formats; it includes JPEG (Joint Photographic Experts Group) compression, newer decompression algorithms, and CGM (Computer Graphics Metafile) for graphics ("National Imagery Transmission Format Standard (NITFS)," n.d.).
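NITF files announce their version in the first bytes of the file header, which makes a quick format check straightforward.  A minimal sketch follows; the file path in the usage line is hypothetical.

# Minimal sketch: an NITF file begins with the file profile name and version
# ("NITF02.10" for NITF 2.1) in the first nine bytes of its header.
# The file path in the usage line is hypothetical.

def nitf_version(path):
    """Return the NITF version string, or None if the file is not NITF."""
    with open(path, "rb") as f:
        header = f.read(9)  # FHDR (4 bytes, "NITF") + FVER (5 bytes, e.g. "02.10")
    if not header.startswith(b"NITF"):
        return None
    return header[4:].decode("ascii")

# print(nitf_version("sample_image.ntf"))  # "02.10" for an NITF 2.1 file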
Figure 2. Global Hawk SAR sensor summary data. Adapted from “Global Hawk program overview” (2011). Retrieved from https://www.faa.gov/about/office_org/headquarters_offices/avs/offices/aam/cami/library/online_libraries/aerospace_medicine/sd/media/GH_Program_Overview_Briefing.pdf
Figure 3. Global Hawk EO/IR sensor summary data. Adapted from “Global Hawk program overview” (2011). Retrieved from https://www.faa.gov/about/office_org/headquarters_offices/avs/offices/aam/cami/library/online_libraries/aerospace_medicine/sd/media/GH_Program_Overview_Briefing.pdf
       The Global Hawk’s sensor suite will most likely continue to be upgraded as technology improves and/or the mission changes.  As of Block 30, the Global Hawk has no means to conduct autonomous operations and record data if its data links are lost.  Research for this paper found that the “RQ-4B Global Hawk block 30 operational test and evaluation report” (2011) recommended that data recording be implemented for “off-tether” missions, and all research on Block 40 indicates that this functionality has not yet been implemented.  It is therefore recommended that data recording be added to meet operational needs in the event of a communications failure between the UAS and the control station.  Additionally, the research did not list the power required by the sensor payload suites; however, the platform has an on-board electric generator that supplies 25 kilovolt-amperes to the platform’s AC (alternating current) electrical system (Kinzig, 2010).
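A minimal sketch of the recommended off-tether behavior appears below: buffer sensor frames to onboard storage whenever the downlink is unavailable, then flush when it returns.  The interfaces and names are hypothetical.

# Hypothetical sketch of "off-tether" data recording: when the downlink is
# down, sensor frames are buffered onboard instead of being dropped, then
# flushed once the link returns. Interfaces and names are illustrative only.

def transmit(frame):
    """Stand-in for the real downlink; prints instead of transmitting."""
    print("downlink:", len(frame), "bytes")

class OffTetherRecorder:
    def __init__(self):
        self.buffered = []  # frames recorded while off-tether

    def handle_frame(self, frame, downlink_up):
        if downlink_up:
            self.flush()
            transmit(frame)
        else:
            self.buffered.append(frame)  # record onboard while off-tether

    def flush(self):
        for frame in self.buffered:
            transmit(frame)
        self.buffered.clear()

recorder = OffTetherRecorder()
recorder.handle_frame(b"frame-1", downlink_up=False)  # buffered, not lost
recorder.handle_frame(b"frame-2", downlink_up=True)   # flushes frame-1, sends frame-2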

References
Kinzig, B. (2010). Global Hawk systems engineering case study.  Air Force Center for Systems Engineering, Air Force Institute of Technology, Wright Patterson, OH. Retrieved from www.dtic.mil/dtic/tr/fulltext/u2/a538761.pdf
National Imagery Transmission Format Standard (NITFS). (n.d.). Retrieved from http://www.globalsecurity.org/intell/systems/nitfs.htm
Northrop Grumman RQ-4 Global Hawk - Copybook. (n.d.). Retrieved from http://www.copybook.com/military/fact-files/northrop-grumman-rq-4-global-hawk
RQ-4B Global Hawk block 30 operational test and evaluation report. (2011). Retrieved from http://pogoarchives.org/m/ns/pentagon-ot-and-e-eval-rq-4b-global-hawk-20110526.pdf


Thursday, June 16, 2016

UAS Sensor Placement


Unmanned Aerial Systems (UAS) have been growing in popularity for several years among both professionals and hobbyists.  Many UAS are available for purchase across a variety of applications.  This research assignment focuses on the sensor placement of two such UAS: the Yuneec Typhoon Q500 4K, capable of full-motion video and still photography, and the Vortex 250 Pro, used predominantly in first-person view (FPV) racing.

            Sensors on UAS need to be strategically placed to maximize their effectiveness and protect them from unforeseen hazards, so it is important for companies to develop sensor placement strategies for their products.  A good sensor placement strategy can improve the overall efficiency of a sensor, such as in global positioning navigation (Vitus & Tomlin, 2010).  It has also been shown that an efficient, strategic sensor placement plan can increase software algorithm performance (Vitus & Tomlin, 2010).
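In the spirit of Vitus and Tomlin's (2010) work, placement can be framed as choosing the candidate locations that most reduce estimation uncertainty.  The toy greedy selection below only sketches that idea; it does not reproduce their algorithm, and the candidate noise values are made up.

# Toy sketch of uncertainty-driven sensor placement: greedily pick the
# candidate location whose measurement noise most reduces the fused estimate
# variance. Candidate noise values are made-up; this is not Vitus & Tomlin's
# actual algorithm.

CANDIDATE_NOISE_VAR = {"nose": 4.0, "tail": 9.0, "top": 1.0, "belly": 2.25}

def fused_variance(selected):
    """Variance of the inverse-variance-weighted fusion of selected sensors."""
    if not selected:
        return float("inf")
    return 1.0 / sum(1.0 / CANDIDATE_NOISE_VAR[name] for name in selected)

def greedy_place(k):
    chosen = []
    for _ in range(k):
        remaining = [c for c in CANDIDATE_NOISE_VAR if c not in chosen]
        chosen.append(min(remaining, key=lambda c: fused_variance(chosen + [c])))
    return chosen

print(greedy_place(2))  # ['top', 'belly'] -- the two lowest-noise locations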

            The Typhoon Q500 4K, manufactured by Yuneec Electric Aviation, is a market-competitive quadcopter similar in specification to the popular DJI Phantom 3 (Estes, 2015).  When selecting a UAS for high-quality image collection, it is important to choose a platform with high-quality sensors (“UAV sensors,” n.d.).  An understanding of the light spectrum will help potential buyers pick optical sensors that meet their needs (“UAV sensors,” n.d.).  Visible light sensors, near-infrared sensors, and infrared sensors each capture a specific band within the electromagnetic spectrum (“UAV sensors,” n.d.).  It is therefore important to purchase the right sensor for the intended application and to ensure its placement on the UAS will produce effective results.  The Q500 4K utilizes a visible light sensor suitable for a variety of applications such as agriculture, surveying, forestry, and surveillance (“UAV sensors,” n.d.).  The stabilized camera captures 4K video, offers adjustable 1080p high-definition slow-motion capability, and takes 12-megapixel still pictures (Estes, 2015).  The camera is mounted to the CGO3 gimbal, whose placement below the main body yields a 130-degree field of view (Amato, 2015).  The camera is modular, meaning it can be detached from the UAS for future sensor and payload upgrades (Amato, 2015).
Figure 1.  Picture of the Typhoon Q500 4k quadcopter. Adapted from “Yuneec announces new world class drone” by S. Patel (2015). Retrieved from http://www.guysgab.com/yuneec-announces-new-world-class-drone/

            Another vital sensor installed on the Typhoon Q500 is the global positioning system (GPS) receiver, which enables the pilot to easily fly and control the UAS.  The Q500 can be flown if GPS coverage is lost; however, it is more difficult to control.  GPS also supports a geo-fence (virtual barrier) that, when selected, keeps the Q500 within a 300-foot perimeter of the operator; additionally, if the control signal from the transmitter is lost, the Q500 will automatically fly back to the pilot’s location ("Typhoon Q500 4K instructional manual," n.d.).  GPS sensor placement is important to ensure reliable signal reception, and it is important to remember that GPS coverage may be lost if the quadcopter is flown indoors.  If the GPS receiver is connected to at least seven satellites, the “Follow Me” mode can be selected, which enables the Q500 to follow the pilot at a selected altitude.
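The geo-fence reduces to a simple distance check against the home point.  The sketch below illustrates that logic only, not Yuneec's firmware; the 300-foot radius comes from the text, and a flat-earth approximation is adequate at that scale.

import math

# Illustrative geo-fence check using a flat-earth approximation (fine at a
# 300 ft radius). The radius comes from the text; names and enforcement are
# assumptions, not Yuneec's firmware.

GEOFENCE_RADIUS_FT = 300.0
FT_PER_DEG_LAT = 364000.0  # rough feet per degree of latitude

def inside_geofence(home, current):
    """home and current are (lat, lon) tuples in decimal degrees."""
    dlat_ft = (current[0] - home[0]) * FT_PER_DEG_LAT
    dlon_ft = (current[1] - home[1]) * FT_PER_DEG_LAT * math.cos(math.radians(home[0]))
    return math.hypot(dlat_ft, dlon_ft) <= GEOFENCE_RADIUS_FT

home = (40.0000, -74.0000)
print(inside_geofence(home, (40.0005, -74.0000)))  # ~182 ft north -> True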

            When discussing first-person view (FPV) UAS racing, the Vortex 250 Pro is a competitive and reasonable purchase.  FPV UAS racing, also known as quadcopter racing, is a relatively new phenomenon growing in popularity (Anthony, 2016).  During FPV racing, the pilot wears a pair of goggles that receives a live video feed from the UAS, allowing the racer to be controlled from the aircraft’s point of view (Anthony, 2016).
Figure 2.  Image of the Vortex 250 Pro. Adapted from “Immersion Vortex 250 Pro FPV Quadcopter” (2015). Retrieved from http://www.dronetrest.com/t/immersionrc-vortex-250-pro-fpv-quadcopter/1418

The Vortex 250 Pro camera mount can support a flight cam or a high-definition camera ("ImmersionRC Vortex 250 Pro FPV quadcopter - Product - DroneTrest," 2015).  It is important that the camera be mounted in a position that gives the pilot the perspective needed to successfully navigate the FPV racecourse, which may include obstacles.  The camera’s placement within the airframe protects it from impacts, and it is supported by a vibration-dampened carbon fiber plate ("ImmersionRC Vortex 250 Pro FPV quadcopter - Product - DroneTrest," 2015).

            Most FPV UAS racers do not include goggles when purchased.  The goggles are not required to fly the drone, but they enhance the experience, and if the UAS is used for racing, the research indicates they are a requirement to be competitive.  The goggles are what the pilot wears to fly the platform visually.  It is recommended that the optics be glass and have digital head-tracking technology, along with a field of view between 25 and 45 degrees ("The ultimate FPV system guide 2016 - Best drone goggles," 2016).  The camera on the UAS racer transmits a feed to the receiver, which in turn passes the feed to a display ("The ultimate FPV system guide - Everything explained - DroneUplift," n.d.).
Figure 3. The basic setup of FPV UAS system. Adapted from “The ultimate FPV system guide - Everything explained - DroneUplift," (n.d.). Retrieved from http://www.droneuplift.com/the-ultimate-fpv-system-guide-everything-explained/

            The Vortex 250 Pro also has an integrated full-graphic on-screen display, an onboard black box that collects and records flight data for tuning purposes, 40-channel NexWaveRF video, and seven dedicated 32-bit ARM (Advanced RISC Machine) processors ("ImmersionRC Vortex 250 Pro FPV quadcopter - Product - DroneTrest," 2015).

            Whether a UAS is used for aerial photography or for FPV racing, sensor placement is important.  If UAS sensors are not strategically placed, they will not achieve peak performance, nor will they be protected from damage in an unwanted crash.



References

Amato, A. (2015, April 2). Yuneec Q500 Typhoon review - DRONELIFE. Retrieved from http://dronelife.com/2015/04/02/dronelife-reviews-the-yuneec-q500-typhoon/

Anthony, S. (2016, January 28). First-person drone racing is much harder than I expected | Ars Technica. Retrieved from http://arstechnica.com/gadgets/2016/01/first-person-drone-racing-is-much-harder-than-i-expected/

Estes, A. (2015, September 28). Yuneec Typhoon Q500 4K review: This is my new favorite drone. Retrieved from http://gizmodo.com/yuneec-typhoon-q500-4k-review-this-is-my-new-favorite-1731109743

ImmersionRC Vortex 250 Pro FPV quadcopter - Product - DroneTrest. (2015, November). Retrieved from http://www.dronetrest.com/t/immersionrc-vortex-250-pro-fpv-quadcopter/1418

Patel, S. (2015, July 14). Yuneec announces new world class drone - Guys Gab. Retrieved from http://www.guysgab.com/yuneec-announces-new-world-class-drone/

The ultimate FPV system guide - Everything explained - DroneUplift. (n.d.). Retrieved from http://www.droneuplift.com/the-ultimate-fpv-system-guide-everything-explained/

The ultimate FPV system guide 2016 - Best drone goggles. (2016). Retrieved from http://www.dronethusiast.com/the-ultimate-fpv-system-guide/

Typhoon Q500 4K instructional manual. (n.d.). Retrieved from https://www.wellbots.com/content/Yuneec/q500_4k_user_manual.pdf

UAV sensors. (n.d.). Retrieved from http://www.questuav.com/news/uav-sensors

Vitus, M., & Tomlin, C. (2010). Sensor placement for improved robotic navigation. Retrieved from http://www.roboticsproceedings.org/rss06/p28.pdf