Tuesday, October 24, 2017

2.3 Blog: Unmanned Aerial Systems



Unmanned Aerial Systems (UAS) Lost Link Procedures

Unmanned Aerial Systems (UAS) that operate within our National Airspace System (NAS), whether within Line of Sight (LOS) or Beyond Visual Line of Sight (BVLOS), must be equipped with the appropriate technologies to ensure a safe recovery of the aerial platform in the event of a lost data link between the operator and the aerial platform. In accordance with Federal Aviation Regulations (FARs) Parts 91.3 and 91.13, General Operating and Flight Rules, the pilot in command of an aircraft is responsible for that aircraft's operation and must ensure that the aircraft is not operated in a manner that endangers persons or property (Federal Aviation Administration, 2017a). In other words, even when things go wrong, the pilot in command remains responsible and accountable.

In addition, Part 107, Small Unmanned Aircraft Systems, of the FARs discusses in detail the operating rules for a remote pilot in command; these rules do not relieve the pilot in command of the general operating responsibilities outlined in Part 91. In the event a small UAS (sUAS) operator loses the data link with the platform, the operator is still responsible. Fortunately, unless otherwise authorized, sUAS are operated within line of sight and under very strict rules, which lessens the potential for damage in a lost link scenario. Most small commercial UAS operating under Part 107 incorporate lost link contingency features such as safe modes and return-to-home modes (Stansbury, Tanis, & Wilson, 2009). When the sUAS detects a lost link, the platform will autonomously fly to the point of launch or a pre-programmed waypoint; some are also capable of auto-landing. Two examples are the Piccolo and Procerus Kestrel autopilots (Stansbury et al., 2009).
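The return-to-home contingency described above can be sketched as a simple link watchdog: the autopilot tracks the age of the last command packet and switches modes when it goes stale. This is an illustrative sketch only; the class, field names, and the 3-second timeout are assumptions, not taken from any specific autopilot.

```python
from dataclasses import dataclass

LINK_TIMEOUT_S = 3.0  # assumed: declare lost link after 3 s of silence


@dataclass
class FailsafeController:
    home: tuple          # point of launch (lat, lon)
    last_heartbeat: float  # timestamp of last packet from the GCS

    def mode(self, now: float) -> str:
        """Return the flight mode the platform should be in."""
        if now - self.last_heartbeat < LINK_TIMEOUT_S:
            return "NORMAL"
        # Link is stale: autonomously return to the point of launch,
        # mirroring the return-to-home behavior described above.
        return "RETURN_TO_HOME"


ctrl = FailsafeController(home=(29.19, -81.05), last_heartbeat=100.0)
print(ctrl.mode(now=101.0))  # link fresh
print(ctrl.mode(now=110.0))  # link stale
```

A real autopilot would layer additional states (loiter, auto-land) on top of this, but the core decision is the same timeout check.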

UAS that fly outside the general scope of Part 107 under a waiver and/or authorization will have published procedures in the event of a lost link between the Ground Control Station (GCS) and the air vehicle. During the certificate of waiver or authorization process, lost link procedures are addressed but will vary with the type of UAS (Federal Aviation Administration, 2017b). Letters of Agreement (LOA) between Air Traffic Control (ATC) and the UAS proponent ensure that a lost link contingency plan is in place and that lost link procedures will not interfere with other NAS traffic (Federal Aviation Administration, 2017b).

UAS flown by the military also have published lost link procedures. A good example can be found in U.S. Army Fort Knox Regulation 95-23, Unmanned Aircraft System Flight Rules (unclassified). This regulation specifies the following:

Small UAS
·         UAS will have a pre-programmed lost link location and altitude.
·         The UAS will orbit until the link can be re-established or the aircraft runs out of fuel.
Large UAS
·         UAS will proceed at mission altitude to a pre-programmed lost link location, then spiral down to 4,300 feet MSL.
·         The UAS will orbit at 4,300 feet MSL while attempts are made to re-establish the link (United States Army, Fort Knox, 2016).
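The Fort Knox procedures above amount to a simple dispatch on platform class. A minimal sketch, with invented field names standing in for the regulation's prose:

```python
def lost_link_action(uas_class: str) -> dict:
    """Map a UAS class to its Fort Knox Reg 95-23 lost-link behavior.

    The dictionary keys are illustrative labels, not terms from the
    regulation itself.
    """
    if uas_class == "small":
        # Orbit at the pre-programmed lost link location and altitude.
        return {"action": "orbit_at_lost_link_point",
                "until": "link_restored_or_fuel_exhausted"}
    if uas_class == "large":
        # Proceed at mission altitude, then spiral down and orbit.
        return {"action": "proceed_at_mission_altitude_then_spiral",
                "orbit_altitude_ft_msl": 4300}
    raise ValueError(f"unknown UAS class: {uas_class}")


print(lost_link_action("small"))
print(lost_link_action("large"))
```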

Conclusions
Success in a lost link scenario depends on two parts: the first is establishing lost link procedures and the protocols to re-establish the link, and the second is the UAS architecture. The air vehicle must be able to autonomously recognize when command, control, and communications (C3) are lost, then independently carry out the tasks to re-establish C3, or safely recover on its own. It is important that operators fully understand their equipment. sUAS hobbyists need to know the capabilities of their air vehicles and what autonomous actions they will take in the event of a lost link. UAS operating in controlled airspace must comply with their waivers and authorizations and strictly adhere to that documentation.
 
In 2011, The MITRE Corporation began working with the FAA to develop a UAS onboard Intelligent Analyzer that detects lost link situations and converts data on the platform's position, altitude, airspeed, and next waypoint into a synthesized voice message that can be broadcast over emergency frequencies to ATC and other aircraft (Van Cleave, 2011). Future success depends on the FAA working with UAS manufacturers to create technologies, such as the Intelligent Analyzer, that mitigate accidents or damage in the event of lost C3.
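Before such a message is synthesized to speech, it has to be composed from the platform's state. The sketch below shows one plausible way to format that advisory; the callsign, phraseology, and field order are assumptions for illustration, not MITRE's actual format.

```python
def lost_link_advisory(callsign: str, lat: float, lon: float,
                       alt_ft: int, airspeed_kt: int, next_wpt: str) -> str:
    """Compose a text advisory of the kind the Intelligent Analyzer
    would convert to a synthesized voice broadcast."""
    return (f"{callsign} lost link, position {lat:.2f} {lon:.2f}, "
            f"altitude {alt_ft} feet, airspeed {airspeed_kt} knots, "
            f"proceeding to waypoint {next_wpt}")


# Hypothetical platform state after a detected link loss:
msg = lost_link_advisory("N123UA", 38.90, -77.03, 4500, 85, "BRAVO")
print(msg)
```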
 
References

Federal Aviation Administration. (2017a). Federal aviation regulations: Part 91. Washington, DC: U.S. Dept. of Transportation, Federal Aviation Administration.

Federal Aviation Administration. (2017b). Unmanned aircraft systems (UAS) (JO 7200.23A). Retrieved from the Federal Aviation Administration website: https://www.faa.gov/documentLibrary/media/Order/JO_7200.23A_Unmanned_Aircraft_Systems_(UAS).pdf

Stansbury, R. S., Tanis, W., & Wilson, T. A. (2009, April). A technology survey of emergency recovery and flight termination systems for UAS. Paper presented at AIAA InfoTech Aerospace Conference, Seattle, WA. Retrieved from http://commons.erau.edu/publication/73/

United States Army, Fort Knox. (2016). Fort Knox Regulation 95-23, Unmanned Aircraft System Flight Rules. Retrieved from Headquarters, Fort Knox website: http://www.knox.army.mil/garrison/dhr/asd/docs/regs/r95-23.pdf

Van Cleave, D. A. (2011, January). Keeping track of unmanned aircraft by overcoming "Lost Links". Retrieved from https://www.mitre.org/publications/project-stories/keeping-track-of-unmanned-aircraft-by-overcoming-lost-links






Thursday, October 19, 2017

1.5 Blog: UAS Strengths and Weaknesses



The military uses Unmanned Aerial Systems (UAS) in a variety of applications, one of which is reconnaissance. The same UAS reconnaissance systems used by the military can be used effectively by local governments and civil organizations to fight illegal poaching operations throughout the world.

Traditionally, the military uses UAS reconnaissance for “long range high altitude surveillance” and “close range reconnaissance” (Barnett, Bird, Culhane, Sharkasi, & Reinholtz, 2007). Of the two, “close range reconnaissance” is well suited to anti-poaching operations against small groups of hunters. The same strategies the military uses to locate and identify potential human threats apply to organizations trying to locate and identify poachers in the wilderness.

UAS are categorized by size, weight, range, speed, and platform capabilities (Brown, n.d.). The RQ-11B Raven man-portable small UAS (sUAS) is a perfect example of an unmanned aerial vehicle built for both military and commercial applications, capable of gathering low-altitude intelligence and performing both reconnaissance and surveillance missions (AeroVironment, 2017). Key features include its low operating weight of 4.2 pounds, hand-launched deployment, support for autonomous operations, the ability to auto-land, and a ruggedized design for harsh environments (AeroVironment, 2017). This rapidly deployable sUAS can be outfitted with a forward electro-optical camera and an infrared camera for night operations; this, coupled with over one hour of flight time, makes it ideal for anti-poaching missions (AeroVironment, 2017).

However, as well equipped as the Raven is, it has one shortcoming: without the right computing algorithms, it cannot locate poachers and capture the data needed to fight these illegal activities. This weakness has been overcome through the efforts of professors from the University of Maryland Institute for Advanced Computer Studies (UMIACS) (Chiaramonte, 2015). The university team created predictive analysis algorithms that analyze factors such as terrain data, wildlife patterns, and past poacher behavior to significantly increase the chances of detecting poaching operations (Corrigan, 2017). The algorithm creates the flight path and syncs it to the platform’s onboard autopilot (Chiaramonte, 2015).
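In spirit, such predictive planning can be sketched as scoring grid cells of a reserve by risk factors and routing the sUAS over the highest-scoring cells. The factors and weights below are invented for illustration; the actual UMIACS models are far more sophisticated.

```python
def risk_score(cell: dict) -> float:
    """Weighted combination of assumed risk factors (weights invented)."""
    return (0.5 * cell["past_incidents"]
            + 0.3 * cell["wildlife_density"]
            + 0.2 * cell["terrain_access"])


def plan_path(cells: list, n_waypoints: int = 3) -> list:
    """Pick the highest-risk cells as waypoints for the autopilot."""
    ranked = sorted(cells, key=risk_score, reverse=True)
    return [c["id"] for c in ranked[:n_waypoints]]


# Hypothetical normalized (0-1) factor values for three grid cells:
cells = [
    {"id": "A", "past_incidents": 0.9, "wildlife_density": 0.8, "terrain_access": 0.7},
    {"id": "B", "past_incidents": 0.1, "wildlife_density": 0.9, "terrain_access": 0.2},
    {"id": "C", "past_incidents": 0.6, "wildlife_density": 0.5, "terrain_access": 0.9},
]
print(plan_path(cells, n_waypoints=2))  # cells A and C score highest
```

The resulting waypoint list is what would be synced to the platform's onboard autopilot.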

Another obstacle faced by civilian UAS operations is the lack of legislation to support their anti-poaching objectives. Many countries, especially in southern Africa, have banned UAS operations altogether (Nuwer, 2017). The only effective way to create pro-UAS legislation is through education that sells the benefits and advantages of UAS in anti-poaching efforts.

Final Thoughts
Technologies used in military UAS applications can, with slight modifications, translate into effective tools for civil non-profit and commercial enterprises. The same platform used for anti-poaching can be used for other conservation programs without modification; it can detect, locate, and transmit data. UAS are an economical platform that will continue to develop and be used in applications not yet imagined. Their popularity has spawned an entire industry, and commercial developers may soon be on par with the military, just as commercial aircraft manufacturers were after World War II.

References

AeroVironment. (2017). UAS RQ-11B Raven. Retrieved from https://www.avinc.com/images/uploads/product_docs/Raven_Datasheet_2017_Web_v1.pdf

Barnett, S., Bird, J., Culhane, A., Sharkasi, A., & Reinholtz, C. (2007). Deployable reconnaissance from a VTOL UAS in urban environments. Proceedings of SPIE, 6561. http://dx.doi.org/10.1117/12.718330

Brown, J. (n.d.). Types of military drones: The best technology available today [Web log post]. Retrieved from http://mydronelab.com/blog/types-of-military-drones.html

Chiaramonte, P. (2015, April 12). How drones are battling animal poachers in Africa. Retrieved from http://www.foxnews.com/tech/2015/04/09/drones-being-used-to-predict-and-prevent-animal-poaching-in-africa.html

Corrigan, F. (2017, January 29). 8 Top anti poaching drones for critical wildlife protection. Retrieved from https://www.dronezon.com/drones-for-good/wildlife-conservation-protection-using-anti-poaching-drones-technology/

Nuwer, R. (2017, March 13). High above, drones keep watchful eyes on wildlife in Africa. The New York Times. Retrieved from https://www.nytimes.com/2017/03/13/science/drones-africa-poachers-wildlife.html

Thursday, July 14, 2016

7.5 Blog Activity: Sense and Avoid Sensor Selection


Vision/Optical Sensing and Avoidance

Gabriel P. Riccio

7.4 Research Assignment: Sense and Avoid

UNSY 605 Unmanned Systems Sensing, Perception, and Processing

Embry-Riddle Aeronautical University-Worldwide
14Jul16


Small Unmanned Aerial Systems (sUAS) that are directly piloted by a human have a zero level of autonomy.  The pilot is responsible for all sensing and avoidance of other aircraft and obstacles.  Semi-autonomous and autonomous sUAS must have on-board sense and avoid technologies to reduce the risk of unwanted collisions.  The Defense Advanced Research Projects Agency (DARPA) has developed a fully autonomous sUAS quadcopter that uses high definition cameras, LiDAR (Light Detection and Ranging), sonar, and inertial measurement units for sensing and avoidance (Szondy, 2016).  This combination of sensors has proven very successful in DARPA's testing of the quadcopter.  However, multiple sensors may not be practical for other sUAS applications.  Skydio, a startup company in California, is working diligently to improve sUAS sensing (Popper, 2015).  The company is developing technologies to make sUAS safer and to improve autonomous flight capabilities (Popper, 2015).  Skydio engineers believe they can use standalone ordinary cameras, without sonar or lasers, to achieve effective sense and avoid (Popper, 2015).  The Phantom 4 semi-autonomous sUAS is equipped with vision sensors for sensing and avoidance (Bolton, 2016).  If the Phantom 4 flies within 50 feet of an obstacle it begins to slow, and it comes to a complete stop within 6 feet of an obstacle (Bolton, 2016).  Given the success of the Phantom 4 and other similar sUAS, vision/optical sensors are an excellent sensor choice for obstacle avoidance.
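The slow-then-stop behavior reported for the Phantom 4 can be modeled as a distance-based speed limit. The 50-foot and 6-foot thresholds come from the article above; the linear ramp and cruise speed are assumptions for illustration.

```python
def speed_limit(distance_ft: float, cruise_fps: float = 30.0) -> float:
    """Commanded speed as a function of distance to the nearest obstacle.

    Outside 50 ft: cruise normally.  Inside 6 ft: full stop.  In between:
    ramp down linearly (the ramp shape is an assumption, not DJI's).
    """
    SLOW_AT, STOP_AT = 50.0, 6.0
    if distance_ft <= STOP_AT:
        return 0.0
    if distance_ft >= SLOW_AT:
        return cruise_fps
    return cruise_fps * (distance_ft - STOP_AT) / (SLOW_AT - STOP_AT)


print(speed_limit(100.0))  # clear of obstacles: cruise
print(speed_limit(28.0))   # inside the slow zone: reduced speed
print(speed_limit(5.0))    # inside the stop zone: halt
```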
            Faster and more powerful computers, along with newer algorithms, enhance the effectiveness of vision systems for sUAS (Barry, Oleynikova, Honegger, Pollefeys, & Tedrake, n.d.).  Vision sensors have proven successful in autonomous flight from takeoff to landing while providing obstacle avoidance (Barry et al., n.d.).  When sUAS are outfitted with stereo vision, individual 2-dimensional images are combined to create 3-dimensional images when appropriately referenced and processed (Barry et al., n.d.).  Notable concerns for designers selecting a vision system for their platform are the latency of the data stream, power consumption, and the synchronization of multiple image exposures (Barry et al., n.d.).
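The core of recovering 3-dimensional structure from a calibrated stereo pair is the standard pinhole relation depth = focal length × baseline / disparity. The sketch below uses this textbook formula; the camera parameters are illustrative values, not those of any particular sUAS.

```python
def stereo_depth(disparity_px: float,
                 focal_px: float = 700.0,
                 baseline_m: float = 0.1) -> float:
    """Depth (meters) of a point seen by both cameras of a stereo pair.

    disparity_px: horizontal pixel shift of the point between images.
    focal_px / baseline_m: assumed calibration values for illustration.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    # Larger disparity means the point is closer to the cameras.
    return focal_px * baseline_m / disparity_px


print(stereo_depth(35.0))  # a 35-pixel disparity under these parameters
```

Applying this per pixel across a disparity map yields the 3-dimensional scene the obstacle-avoidance logic reasons over, which is why latency and exposure synchronization matter so much: stale or mismatched images corrupt the disparities.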
            The DJI Phantom 4 has front obstacle sensors that work in conjunction with its computer vision and processing to react to and avoid obstacles in its path ("Phantom 4 - DJI’s smartest flying camera ever," 2016).  In “Normal Mode” the platform will stop and hover if an obstacle is in its path; in other modes, it will alter its flight path to avoid the obstacle or come to a hover if need be ("Phantom 4 - DJI’s smartest flying camera ever," 2016).  The optical sensing system has a 60-degree by 50-degree field of view and uses the collected data to create a 3-dimensional map for obstacle avoidance ("Inside a Drone: Computer Vision," 2016).  Additionally, it has dual cameras mounted on the bottom and dual ultrasonic sensors for position accuracy ("Phantom 4 - DJI’s smartest flying camera ever," 2016).  The overall weight of the platform is 1,380 grams, and it has a top speed of 20 meters per second ("Phantom 4 - DJI’s smartest flying camera ever," 2016).  At a retail price of just under $1,400, it is not cheap, but it is reasonable given all of its embedded autonomous capabilities (Popper, 2016).  Some important specifications of the Phantom 4 obstacle sensing system, based on the DJI company product website, are as follows:
·         Obstacle Sensory Range – 2 feet to 49 feet
·         Width of Optical Sensing System – 0.7 meters to 15 meters
·         Operating Environment – Surface with clear pattern and adequate lighting (lux > 15)
·         Altitude and Operating Range of the Positioning System – 0 feet to 33 feet ("Phantom 4 - DJI’s smartest flying camera ever," 2016).
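Taken together, the quoted range and field-of-view figures define a sensing volume in front of the aircraft. The sketch below checks whether a point lies inside that volume; the geometry is a simplification for illustration (49 ft ≈ 15 m, 2 ft ≈ 0.6 m), not DJI's actual sensor model.

```python
import math


def obstacle_in_sensing_volume(x_m: float, y_m: float, z_m: float) -> bool:
    """True if a point (meters, x forward along the nose) falls inside an
    idealized 0.6-15 m range with a 60-deg x 50-deg field of view."""
    r = math.sqrt(x_m**2 + y_m**2 + z_m**2)
    if not 0.6 <= r <= 15.0:
        return False  # closer than ~2 ft or beyond ~49 ft
    az = math.degrees(math.atan2(y_m, x_m))  # horizontal angle off the nose
    el = math.degrees(math.atan2(z_m, x_m))  # vertical angle off the nose
    return abs(az) <= 30.0 and abs(el) <= 25.0


print(obstacle_in_sensing_volume(5.0, 0.0, 0.0))   # dead ahead, in range
print(obstacle_in_sensing_volume(20.0, 0.0, 0.0))  # ahead but too far
print(obstacle_in_sensing_volume(5.0, 10.0, 0.0))  # in range but off-axis
```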
In conclusion, the DJI Phantom 4 demonstrates how standalone vision/optical sensors coupled with fast computing power can be successfully engineered into sUAS for sensing and obstacle avoidance.  If the goal is full autonomy, then the platform will need multiple sensors, as in DARPA's fully autonomous quadcopter.  The cost of DARPA's quadcopter was not presented in any literature found during the research for this paper; however, it can be reasonably hypothesized that DARPA has invested a considerable amount of money in the project.  For the retail consumer looking for a sUAS with semi-autonomous functionality, one with vision/optical sensors is an excellent choice.
References
Barry, A., Oleynikova, H., Honegger, D., Pollefeys, M., & Tedrake, R. (n.d.). Fast onboard stereo vision for UAVs. Retrieved from http://groups.csail.mit.edu/robotics-center/public_papers/Barry15a.pdf
Bolton, D. (2016, March 2). DJI unveils the Phantom 4 semi-autonomous drone | News | Lifestyle | The Independent. Retrieved from http://www.independent.co.uk/life-style/gadgets-and-tech/news/dji-phantom-4-drone-price-buy-autonomous-tapfly-activetrack-a6908096.html
Inside a Drone: Computer Vision. (2016). Retrieved from http://www.dji.com/newsroom/news/inside-a-drone-computer-vision
Phantom 4 - DJI’s smartest flying camera ever. (2016). Retrieved from https://www.dji.com/product/phantom-4
Popper, B. (2015, January 15). A tiny startup has made big strides in creating self-navigating drones | The Verge. Retrieved from http://www.theverge.com/2015/1/15/7550669/skydio-drone-sense-and-avoid-camera-vision
Popper, B. (2016, March 1). DJI's revolutionary Phantom 4 drone can dodge obstacles and track humans | The Verge. Retrieved from http://www.theverge.com/2016/3/1/11134130/dji-phantom-4-drone-autonomous-avoidance-tracking-price-video
Szondy, D. (2016, February 12). DARPA's fully-loaded quadcopter autonomously navigates an indoor maze at 45 mph. Retrieved from http://www.gizmag.com/darpa-drone-autonomous-45-mph/41810/




Wednesday, July 6, 2016

6.4 Research Assignment Control Station Analysis


Black Knight Unmanned Vehicle Control Station
The Black Knight is an armored Unmanned Ground Combat Vehicle (UGCV) (Valois, Herman, Bares, & Rice, 2008).  This UGCV is capable of forward scouting, surveillance, and target acquisition, and of performing missions that are deemed too hazardous for military personnel (National Robotics Engineering Center, 2016).  The 12-ton Black Knight with its 300-horsepower diesel engine can achieve speeds up to 15 miles per hour (mph) in off-highway terrain while being teleoperated or in autonomous mode (National Robotics Engineering Center, 2016).  The National Robotics Engineering Center (NREC) designed and engineered the control, teleoperation, perception, and on-board safety systems (National Robotics Engineering Center, 2016).  The platform is equipped with Light Detection and Ranging (LiDAR) technology, Forward Looking Infrared (FLIR), sophisticated stereo video cameras, Global Positioning System (GPS) sensors, a wireless data link, and sensors that support both semi-autonomous and autonomous operations (National Robotics Engineering Center, 2016).  The vehicle is controlled by an operator in another vehicle from the Robotic Operator Control Station (ROCS) or off-board with a safety controller (National Robotics Engineering Center, 2016).

Figure 1. Black Knight unmanned ground vehicle with 25-mm cannon. Adapted from “Black Knight prototype unmanned combat vehicle” by Military-Today.com (n.d.). Retrieved from http://www.military-today.com/apc/black_knight.htm.

As stated previously, the Black Knight is controlled by the ROCS, which is located with the operator in another independent armored vehicle (Valois et al., 2008).  The ROCS is composed of a video monitor that displays synthesized views from the driving camera, the Operator Control Software (OCS), and the hand controller (Valois et al., 2008).  The OCS runs on Microsoft Windows and controls the Black Knight through the ROCS interface (Valois et al., 2008).
Figure 2. Black Knight Robotic Operator Control Station. Adapted from “Remote operation of the black knight unmanned ground combat vehicle” by J. Valois, H. Herman, J. Bares, & D. Rice (2008).
The Autonomy, Perception, and Control Module (APCM) “contains all of the sensors, computers, power management, electronics, and networking equipment required to safely perform remote autonomous operations” (Valois et al., 2008).  The APCM Planner module provides orientation information to the driver via the ROCS (Valois et al., 2008).  The driver uses the hand controller to input driving commands to the Black Knight and for camera control (Valois et al., 2008).
Figure 3. Black Knight Modules and Operators. Adapted from “Remote operation of the black knight unmanned ground combat vehicle” by J. Valois, H. Herman, J. Bares, & D. Rice (2008).
The driver uses the joysticks on the hand controller during teleoperation for steering and speed control (Valois et al., 2008).
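A minimal sketch of that joystick-to-drive-command mapping is shown below. The axis conventions, the deadband, and the message format are assumptions for illustration; only the 15 mph cap reflects the Black Knight's stated top speed.

```python
def joystick_to_drive(stick_x: float, stick_y: float,
                      max_speed_mph: float = 15.0) -> dict:
    """Map hand-controller stick deflections (-1..+1) to a drive command.

    stick_y controls speed, stick_x controls steering; small deflections
    inside the deadband are ignored so the vehicle holds steady.
    """
    DEADBAND = 0.05

    def shape(v: float) -> float:
        return 0.0 if abs(v) < DEADBAND else max(-1.0, min(1.0, v))

    return {"speed_mph": shape(stick_y) * max_speed_mph,
            "steer": shape(stick_x)}  # -1 full left .. +1 full right


print(joystick_to_drive(0.0, 1.0))    # full forward, no steering
print(joystick_to_drive(0.02, 0.5))   # tiny x deflection filtered out
```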

In addition to the ROCS hand controller, the vehicle has a remote hand controller with a dedicated wireless connection to the Vehicle Controller Unit (VCU) (Valois et al., 2008).  The hand controller is used by a safety officer, who is located in a separate vehicle (Valois et al., 2008).  In the event of an unsafe condition, the safety officer can issue a “stop” command (Valois et al., 2008).  The hand controller can also be used for dismounted operations (Valois et al., 2008).
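The essential property of such a safety override is that the safety officer's stop always wins over the driver's input. A sketch of that arbitration, with an invented command format:

```python
def vcu_command(operator_cmd: dict, safety_stop_asserted: bool) -> dict:
    """Arbitrate between the driver's command and the safety override.

    A 'stop' asserted on the dedicated safety link overrides whatever
    the driver is commanding; field names are illustrative only.
    """
    if safety_stop_asserted:
        return {"throttle": 0.0, "brake": 1.0, "source": "safety_officer"}
    return dict(operator_cmd, source="driver")


driving = {"throttle": 0.6, "brake": 0.0}
print(vcu_command(driving, safety_stop_asserted=False))  # driver in control
print(vcu_command(driving, safety_stop_asserted=True))   # override wins
```

In a real system this arbitration would live in the VCU firmware so the override works even if the primary data link or the OCS fails.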

The Black Knight faces its own challenges; the most significant control station challenge is the operators themselves.  During operations, the vehicle takes a lot of abuse from the operators; because the drivers are not actually in the Black Knight, they tend to treat it more harshly and recklessly (Valois et al., 2008).  I would recommend additional training for the drivers to ensure they do not unnecessarily drive the vehicles to the breaking point.  The platform should be treated with the same care and concern as if the operators were actually inside it.
References:
Black Knight prototype combat vehicle. (n.d.). Retrieved from http://www.military-today.com/apc/black_knight.htm
National Robotics Engineering Center. (2016). Black Knight Overview. Retrieved from http://www.nrec.ri.cmu.edu/projects/black_knight/
Valois, J., Herman, H., Bares, J., & Rice, D. P. (2008). Remote operation of the Black Knight unmanned ground combat vehicle. Proceedings of SPIE, 6962. doi:10.1117/12.782109