
Drone Net: Information Fusion Networks for Reliable Localization
ICARUS Research Group: ERAU Prescott

The ERAU ICARUS Research Group proposes to develop an open architecture, Drone-Net, to detect, classify, and identify drones in sensitive environments. The network-based nature of the proposed technology naturally leads to cooperation with multiple, geographically disparate campuses such as ERAU Daytona Beach, University of Colorado Boulder, and the University of Alaska Fairbanks. By the end of this proposed effort, the team expects to deliver a Technology Readiness Level (TRL) four (4) device that would facilitate the pursuit of external funding.

The top-level goal of this proposed research is to develop a real-time drone detection system to enhance the safety of the National Airspace System (NAS). Additional goals include the development of an open architecture, which is sensor and algorithm agnostic. The reference design will support replication at low cost to encourage and facilitate deployment in academic environments. This proposed effort will provide students with research opportunities in the areas of heterogeneous information fusion and algorithm development for multi-sensor drone detection, classification, and identification. An education outreach goal will be to conduct testing at multiple campuses to provide independent verification. Finally, the critical findings and recommendations will be made available to federal agencies, published at peer-reviewed conferences, and shared with collaborators.

The principal objectives of this research effort are listed in Table 1 and are decomposed into the domains of research, education, and industry.

Table 1: Objectives

Research
• Develop procedures to allow other researchers to test a variety of sensors and algorithms.
• Include acoustic, ADS-B, primary/secondary RADAR, and LIDAR in the open architecture.
• Analyze the limits and capabilities of spatial, temporal, and spectral resolution for virtual boresight instrumentation.
• Involve graduate students in the development of information fusion algorithms.
• Evolve from information fusion with passive EO/IR to a passive/active sensor suite.

Education
• Integrate ADS-B on existing drones.
• Provide undergraduate students with an opportunity to employ the Drone-Net to explore sensor interfacing, data collection, and processing.

Industry
• Enhance passive methods for civil aviation detection of drones not using ADS-B.
• Share collected data and implemented algorithms in a distributed database of observed compliant, compliant-low-quality, non-compliant, and perhaps hostile drones.

Many studies have indicated that multi-spectral EO/IR detection is quite effective. This methodology has been adopted as the initial stage of our research effort and successfully tested using an embryonic drone detector placed on rooftops at ERAU Prescott.

On the other hand, no single sensing modality will suffice to reliably detect and localize a wide variety of drones, as substantiated by multiple studies. To this end, the ICARUS group proposes to pursue a heterogeneous information fusion approach, beginning with passive EO/IR and progressing to a "richer" passive/active sensor suite. Prior ICARUS research partially funded by DHS ADAC (Department of Homeland Security, Arctic Domain Awareness Center of Excellence) led to development of the SDMSI (Software Defined Multi-Spectral Imager) to detect and track marine traffic. This existing hardware will be adapted to accommodate additional sensors, including acoustic, ADS-B (Automatic Dependent Surveillance – Broadcast), primary/secondary RADAR, and LIDAR, in order to accelerate the development of optimal methods of drone detection, classification, and identification.

The overall vision is to create a network of passive/active drone detection, classification, and identification nodes that enhances security and safety for drone operations beyond what ADS-B and registration alone provide.


a) Background and Introduction
Importance: As reported by NIAG (NATO Industrial Advisory Group) in 2015, around 30,000 Unmanned Aerial Systems (UASs) could be flying in the US in the next decade. Given their low price and ease of control, and the fact that many of them might not be registered with the FAA or equipped with communication devices, these UASs can pose a substantial security threat [1]. The importance of this topic has led government agencies and private companies to attempt to detect, classify, and neutralize drones in sensitive areas, for example through the FBI drone detection system tested at JFK airport [2].

Technical Challenges: Detection of Low-altitude, Slow, Small (LSS) UASs is an open research problem, and no company so far has provided the statistical evidence necessary to support robust detection and identification of LSS UAS targets. Some challenges are: background clutter (e.g., clouds, rain, glare, fog), sensor noise, background noise (for acoustic sensors), other non-threatening targets (e.g., birds or flying insects), the tiny, unrecognizable appearance of small aerial objects at long distance, the existence of non-cooperative drones, drones not equipped with communication devices, and hijacked drones sending misguiding signals. Due to the challenges presented by non-compliant and even hostile drone traffic, this task cannot be achieved with a single detection modality for all potential targets [1, 4, 35].

Background Work: Although considerable research and development has been done by private companies in the area of detection, classification, and even neutralization of unmanned aircraft [5-14], published work accessible to the research community and public can rarely be found. As a result, an open-architecture research platform that can be utilized for testing and expanding the functionality of these systems is lacking.

Novelty of the Proposed Work: The overall vision of this proposal is to create an open-architecture research platform that employs a network of sensors with different modalities, external information, and state-of-the-art machine learning algorithms for detection and classification of cooperative and non-cooperative unmanned aircraft. The proposed system will provide higher security and safety for drone operations as compared to using ADS-B and registration alone. Major novelties and contributions of this proposal include the following:

1. Creation of an open hardware-software architecture, which can be shared with other researchers to employ different sensors and algorithms for UAS detection and classification.

2. Employment of sensor and information fusion: passive sensors, including EO, IR, and acoustic, in addition to ADS-B, will be used at the beginning of the project; later, active sensors, including RADAR and LIDAR, will be utilized. External information, such as registered flights on flightradar24.com [15], will be incorporated into the system (a sketch of this comparison appears after this list). It is worth mentioning that prior ICARUS group research at the Prescott campus, which led to development of the SDMSI (Software Defined Multi-Spectral Imager) for detecting and tracking marine traffic [16, 17], will be adapted to accommodate additional sensors for detecting UAVs.

3. Deployment of a network of sensors to different locations across campus to extend the detection space and to compensate for the limitations of passive sensors (e.g., limited field of view and IR/EO camera range).

4. An inverted approach to machine learning, using the time history of tracked UASs for identification and classification purposes.

5. Outreach: hosting competitions, in which the students are challenged to localize the Drone-Net sensors by flying their drones over the campus without or before being detected by the system.
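As a concrete illustration of the fusion in contribution 2 above, the sketch below shows one way an observed EO detection could be checked against an expected ADS-B position. This is a minimal sketch under assumed conditions (flat-earth ENU geometry at short range); the function names, coordinates, and angular tolerance are illustrative assumptions, not part of the proposal.

```python
# Illustrative sketch (not the proposal's implementation): check whether an EO
# detection is consistent with a self-reported ADS-B position. Assumes short
# ranges so a flat-earth ENU approximation is acceptable.
import math

EARTH_RADIUS_M = 6371000.0

def adsb_to_az_el(sensor_lat, sensor_lon, sensor_alt, ac_lat, ac_lon, ac_alt):
    """Convert a reported aircraft geodetic position to azimuth/elevation
    (degrees) as seen from the sensor node, using a local ENU approximation."""
    d_lat = math.radians(ac_lat - sensor_lat)
    d_lon = math.radians(ac_lon - sensor_lon)
    north = d_lat * EARTH_RADIUS_M
    east = d_lon * EARTH_RADIUS_M * math.cos(math.radians(sensor_lat))
    up = ac_alt - sensor_alt
    azimuth = math.degrees(math.atan2(east, north)) % 360.0
    elevation = math.degrees(math.atan2(up, math.hypot(east, north)))
    return azimuth, elevation

def is_compliant(detection_az, detection_el, expected_az, expected_el, tol_deg=2.0):
    """Flag a detection as consistent with an ADS-B report if the angular
    residual is within tolerance; otherwise treat it as non-compliant traffic."""
    az_err = abs((detection_az - expected_az + 180.0) % 360.0 - 180.0)
    el_err = abs(detection_el - expected_el)
    return az_err <= tol_deg and el_err <= tol_deg

# Hypothetical example: an ADS-B report places a registered drone near the
# sensor; the EO camera reports a detection at nearly the same bearing, so the
# track is flagged as compliant.
exp_az, exp_el = adsb_to_az_el(34.615, -112.450, 1540.0, 34.620, -112.444, 1700.0)
print(is_compliant(44.5, 11.7, exp_az, exp_el))  # True: angular residual is small
```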

b) Methodology and Approach
Use of EO/IR instrumentation alone requires significant visual intelligence to segment drones, aircraft (fixed-wing and rotary), and many other flying objects in the field of view, including birds, bugs, clouds, atmospheric thermal variation, and horizon obstructions, as shown in Figures 1 and 2. In preliminary tests with a rooftop thermal and visible camera similar to the proposed Software Defined Multi-Spectral Imager (SDMSI) [17], a short 12-hour test revealed the many challenges associated with EO/IR. At the same time, these preliminary results are promising: manual classification of drones, planes, and other objects using a customized frame-by-frame review tool resulted in 99.66% agreement among three independent human visual reviewers [17]. Truth models for non-compliant flying objects can only come from human review such as this, but for GPS and ADS-B compliant drones and aircraft, truth models can of course also be based on spatial and temporal self-reporting compared to observation.

Figure 1: Insects EO/IR Imaging.
Figure 2: Birds EO/IR Imaging.
Figure 3: Drone EO/IR Signature.

Use of Salient Object Detectors (SODs) is one method proposed to automate flying object detection, most often based on adaptive noise-filtered motion triggers. SODs are typically based on behavior (motion and characteristics of the motion) [18], visual properties (contrast, texture, gradient edges, color) [19], or object characteristics (shape, symmetry, profile) [20]. For example, bugs are easily distinguished by flight trajectory and shape when manually reviewed, but capturing this in an automated algorithm will be explored with deep learning (artificial neural network machine learning) as well as first principles (physics of flight and shape). With a simple motion trigger based on statistical change in pixel intensity, gradients, and extents, bugs are easily detected but are not always easily distinguished from drones or aircraft, as shown in Figures 3 and 4. Controlling this distinction is necessary to manage the Receiver Operating Characteristic (ROC), just as with RADAR and LIDAR operation [21, 22].
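As an illustration only, the following minimal sketch (not the proposal's implementation) shows a motion trigger of this kind using OpenCV background subtraction. The subtractor parameters, threshold, and file name are assumptions; the foreground-fraction threshold is the knob that moves the detector along its ROC curve between missed detections and false alarms.

```python
# Simple statistical motion trigger: flag frames whose foreground pixel
# fraction exceeds a threshold, using OpenCV's MOG2 background subtractor.
import cv2

def run_motion_trigger(video_path, fg_fraction_threshold=0.001):
    subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16)
    capture = cv2.VideoCapture(video_path)
    frame_index = 0
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        mask = subtractor.apply(gray)
        # Fraction of pixels flagged as foreground; raising the threshold
        # rejects small movers (bugs) at the risk of missing distant drones.
        fraction = cv2.countNonZero(mask) / mask.size
        if fraction > fg_fraction_threshold:
            print(f"frame {frame_index}: motion trigger ({fraction:.4%} foreground)")
        frame_index += 1
    capture.release()

run_motion_trigger("rooftop_eo_capture.mp4")  # hypothetical capture file
```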

It is envisioned that this work must integrate a wide number of visual cues to detect, segment, classify, and ultimately identify various flying objects visible to an EO/IR system like the SDMSI. This requires significant real-time image processing and scene reasoning (from first principles or machine learning), combined with secondary cues (audio), active sensors added in later stages for verification and improvement (LIDAR and RADAR), and information fusion (comparison of observed aerial objects to those expected based on ADS-B reporting and flightradar24 aggregation). The goal is to provide awareness equal to or better than that which a human observer could provide if asked to monitor drone activity around a campus, airport, or other area where drone activity must be monitored and controlled [23]. This can be combined with compliant-drone methods such as geo-fencing (GPS keep-out areas) and ADS-B reporting. Likewise, interesting new research pursuits, such as studying bird and aviation interaction with scientists who study animal behavior, can be enabled by the SDMSI and multi-modal sensing as a potential follow-on or parallel research effort once basic principles of operation are established through this proposed program.

The ultimate goal for this research project is to provide information fusion that could complement ADS-B and aggregation services such as flightradar24, so that shared national airspace, and local airspace around campuses, is safer and more secure for drones and civil aviation. Not all drones will be compliant, and some could perhaps even be hostile. While flight regimes for various aerial objects tend to naturally keep them apart, it is critical that this be monitored for safety and security. This can only enhance methods of geo-fencing and compliant self-reporting. The ground Drone Net might be enhanced in future work to include drone see-and-avoid instrumentation as well.


Figure 4: Aircraft EO/IR Signature.

First, smart EO/IR and multi-modal sensor systems must be researched for use in this domain and analyzed for effectiveness. Simply installing cameras, and even multi-spectral instruments, is not sufficient; the visual, spatial, and temporal reasoning must also be encapsulated to assist pilots (both drone and civil) and air-traffic control, and to support enforcement and compliance.

A key component in the proposed approach for the detection, classification, and identification of flying objects is the ability to predict their spatio-temporal behavior in real time. This information enables efficiency improvements in the vision tracking algorithm and, at the same time, allows assessment of the risk level associated with semi-compliant or non-compliant drones. Several works in the literature address this task by assuming a known analytic dynamic model of a flying object and its trajectory and performing parameter estimation using recursive least squares [24, 25] in conjunction with an Extended Kalman Filter (EKF) [26, 29].
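For reference, a heavily simplified sketch of this model-based alternative is shown below: a linear Kalman filter with an assumed constant-velocity model for a single tracked coordinate. The cited works use richer dynamic models, recursive least squares, and the EKF; the frame rate and noise values here are illustrative tuning assumptions.

```python
# Minimal linear Kalman filter sketch (constant-velocity model, one image-plane
# coordinate), standing in for the model-based prediction approach cited above.
import numpy as np

dt = 1.0 / 30.0                                  # frame period (30 Hz camera assumed)
F = np.array([[1.0, dt], [0.0, 1.0]])            # state transition: [position, velocity]
H = np.array([[1.0, 0.0]])                       # we only measure position
Q = np.diag([1e-4, 1e-2])                        # process noise (tuning assumption)
R = np.array([[0.5]])                            # measurement noise (pixels^2)

x = np.array([[0.0], [0.0]])                     # initial state estimate
P = np.eye(2) * 10.0                             # initial covariance

def kf_step(x, P, z):
    """One predict/update cycle for a scalar position measurement z."""
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    y = np.array([[z]]) - H @ x                  # innovation
    S = H @ P @ H.T + R                          # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)               # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

for z in [10.2, 11.1, 11.9, 13.0, 14.1]:         # noisy pixel positions of a track
    x, P = kf_step(x, P, z)
print("estimated position, velocity:", x.ravel())
```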

On the other hand, given the large variety of flying objects potentially considered in this research effort, such as birds, bugs, small UASs, and commercial aircraft, a learning-based approach has the potential to be more accurate and less CPU-intensive. The key idea of such a methodology is that the dynamics of a moving object can be modeled by observing examples of its motion in space and training a machine learning algorithm [27]. A moving object can be modeled by a second-order model:

$$\ddot{\boldsymbol{x}} = \boldsymbol{d}(\boldsymbol{x}, \dot{\boldsymbol{x}})$$

where the vector $\boldsymbol{x} \in \mathbb{R}^r$ is the pose of the object, i.e., its position and orientation. In order to learn the dynamic function $\boldsymbol{d}(\cdot)$, $S$ training trajectories $\{\boldsymbol{x}, \dot{\boldsymbol{x}}, \ddot{\boldsymbol{x}}\}_{t,k}$ are observed over $T+1$ time steps, with $t = 0, \ldots, T$ and $k = 1, \ldots, S$. The dynamic function $\boldsymbol{d}(\cdot)$ can then be modeled using Support Vector Regression (SVR) [28], which performs a non-linear regression from $[\boldsymbol{x}, \dot{\boldsymbol{x}}]$ to $\ddot{\boldsymbol{x}}$:

$$\ddot{x}_i = d_{SVR,i}([\boldsymbol{x}, \dot{\boldsymbol{x}}]), \qquad i = 1, \ldots, r$$

$$d_{SVR,i} = \sum_{m=1}^{M} \alpha_{i,m}\, K([\boldsymbol{x}, \dot{\boldsymbol{x}}], [\boldsymbol{x}, \dot{\boldsymbol{x}}]_m) + \beta_i$$

The support vectors $[\boldsymbol{x}, \dot{\boldsymbol{x}}]_m$ are used for the regression and $\alpha_{i,m} \neq 0$ are the regression coefficients. Several models are suggested in the literature for the kernel function $K: \mathbb{R}^{2r} \to \mathbb{R}$, such as the Radial Basis Function (RBF) $K(\boldsymbol{x}, \boldsymbol{x}_m) = e^{-\gamma \|\boldsymbol{x} - \boldsymbol{x}_m\|_2^2}$ [27].
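A minimal sketch of this regression step might look as follows, assuming scikit-learn's RBF-kernel SVR as a stand-in for LIBSVM [28] and a toy damped-oscillator dynamic in place of real tracked trajectories; the hyperparameters are illustrative.

```python
# Hedged sketch of the SVR dynamics learning step: fit one regressor per
# acceleration component from [x, xdot] to xddot on synthetic training data.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Generate training samples {x, xdot, xddot} from a known toy dynamic
# d(x, xdot) = -k*x - c*xdot, which the regressor must recover.
k, c = 4.0, 0.5
X = rng.uniform(-1.0, 1.0, size=(500, 2))        # columns: [x, xdot]
xddot = -k * X[:, 0] - c * X[:, 1] + rng.normal(0.0, 0.01, size=500)

# One SVR per acceleration component (here r = 1); gamma plays the role of the
# RBF kernel parameter in K(x, x_m) = exp(-gamma * ||x - x_m||^2).
model = SVR(kernel="rbf", C=10.0, gamma=1.0, epsilon=0.01)
model.fit(X, xddot)

query = np.array([[0.3, -0.2]])                  # pose and velocity of a tracked object
print("predicted acceleration:", model.predict(query))
print("true acceleration:", -k * 0.3 - c * -0.2)
```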

After a dynamic function $\boldsymbol{d}(\cdot)$ has been identified for each specific class of flying objects considered in this work, it will be stored in a sparse dictionary [30]. The on-line trajectory prediction step will then be subdivided into three main components:

1. The relative motion of a newly detected flying object will be captured by means of rigid motion segmentation based on the proposed sensing system, and its trajectory will be decomposed into rotational and translational motion primitives [31].

2. The sparse dictionary will be used both to classify the object to be analyzed [32] and to associate it with a specific dynamic function $\boldsymbol{d}(\cdot)$ generated using the SVR algorithm.

3. A spatio-temporal representation of the predicted trajectory will be created by integrating the dynamic function and will be fed back into the vision algorithm (a sketch appears after this list).
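The following is a minimal sketch of step 3, assuming a hypothetical dynamics callable returned from the dictionary lookup and simple forward-Euler integration; the actual integrator, step size, and dynamics are design choices not fixed by the proposal.

```python
# Roll the identified dynamic function d(x, xdot) forward in time to produce a
# predicted trajectory that can be fed back to the vision tracker as a search prior.
import numpy as np

def predict_trajectory(d, x0, xdot0, dt=1.0 / 30.0, steps=30):
    """Forward-Euler integration of xddot = d(x, xdot) from an initial state.
    Returns the predicted poses for the next `steps` frames."""
    x, xdot = np.asarray(x0, float), np.asarray(xdot0, float)
    poses = []
    for _ in range(steps):
        xddot = d(x, xdot)
        xdot = xdot + dt * xddot                 # integrate acceleration
        x = x + dt * xdot                        # integrate velocity (semi-implicit)
        poses.append(x.copy())
    return np.array(poses)

# Example with a toy dictionary entry: a drone-like class with mild damping.
toy_dynamics = lambda x, xdot: -0.2 * xdot
trajectory = predict_trajectory(toy_dynamics, x0=[0.0, 0.0, 50.0], xdot0=[5.0, 0.0, 0.5])
print("predicted pose one second ahead:", trajectory[-1])
```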


c) Significance
The challenge and opportunity presented by use of UAS "drones" in the NAS has historic significance not seen since the early days of aviation growth after the First World War. The FAA estimates that by 2020 the drone market will be $98 billion, with 7 million drones added annually [33]. This huge number, compared to current estimates by NIAG [1] and Drone Industry Insights [35], exhibits the non-linear growth anticipated by the FAA in just five short years. Market beneficiaries include industrial inspection, aerial photography, insurance, agriculture, and government services [33]. While ADS-B for drones, along with registration, has been proposed as a quick fix to allow drones into the NAS and to share populated areas, it is not clear how this will work for all types of drones, ranging from professional service to hobby. For example, many drones will be fully compliant, but some may be semi-compliant (e.g., low-quality position reporting due to degraded GPS), and others perhaps even totally non-compliant or hostile [34]. Research by Sandia National Labs has shown that drones typically have a very low RADAR cross-section, similar to stealth aircraft [1], and can present a significant security and safety threat. The risk is that even one or two national security incidents involving service drones, hobby drones, or terrorism could result in the grounding of all drones in the NAS.

d) Project Personnel and Work Plan

Team Members

Dr. Sam Siewert (PI): Assistant Professor, Computer and Software Engineering, ERAU Prescott. [CV, Bio] Dr. Siewert will serve as the principal investigator on this research project; his level of effort will be full-time during the summer and 25% during the fall and spring terms, with an associated teaching load not to exceed 9 credit hours. Dr. Siewert's role will be that of project administrator and lead researcher. He will define the passive EO/IR sensing algorithms and implementation.

Dr. Mehran Andalibi (Collaborator): Assistant Professor, Mechanical Engineering, ERAU Prescott. [CV] Dr. Andalibi will serve as a team member on this research project; his level of effort will be 75% during the summer and 25% during the fall and spring terms, with an associated teaching load not to exceed 9 credit hours. Dr. Andalibi's role will be to lead the salient object detection, classification, and identification efforts.

Dr. Stephen Bruder (Collaborator): Associate Professor, Electrical Engineering, ERAU Prescott. [CV] Dr. Bruder will serve as a team member on this research project; his level of effort will be 75% during the summer and 25% during the fall and spring terms, with an associated teaching load not to exceed 9 credit hours. Dr. Bruder's role will be to define, develop, and implement the necessary data fusion algorithms.

Dr. Iacopo Gentilini (Collaborator): Assistant Professor, Aerospace Engineering, ERAU Prescott. [CV] Dr. Gentilini will serve as a team member on this research project; his level of effort will be 25% during the summer and 25% during the fall and spring terms, with an associated teaching load not to exceed 9 credit hours. Dr. Gentilini's role will be to lead the machine learning portion of this project and LIDAR integration.

Mr. Jim Weber: ICARUS Lab Manager, ERAU Prescott. Mr. Weber will serve as a support member on this project; his level of effort will be 25% during the summer and 10% during the fall and spring terms, funded from overhead. Mr. Weber will support the preparation and execution of various experiments and serve as the interface to ERAU's laboratory resources.

Schedule/Work Plan

Table 2: Project Timeline, Milestones, and Deliverables.

Year 1
1.1 Develop Drone-Net performance specifications and functional requirements; define both stand-alone and networked cases. 07/01/2017 – 08/01/2017
1.2 Develop preliminary draft of project test/experiment plans. 08/01/2017 – 09/01/2017
2 Transition existing pre-prototype EO/IR to a prototype; select/interface/integrate suitable EO cameras and SWIR/LWIR cameras. 08/01/2017 – 10/01/2017
3 Integrate acoustic sensor into prototype. 10/01/2017 – 12/01/2017
4 Integrate flightradar24 into prototype. 12/01/2017 – 12/31/2017
5 Integrate ADS-B receiver into prototype. 01/01/2018 – 02/28/2018
6 Develop and implement preliminary information fusion approach. 01/01/2018 – 05/15/2018
7 Conduct stand-alone tests of prototype. 05/15/2018 – 06/20/2018
Milestone: Demo stand-alone prototype at ERAU Daytona to ASSURE group. 06/25/2018 – 06/29/2018
Deliverable: Annual progress report to include analysis of test results. 06/30/2018

Year 2
8 Develop machine learning algorithms; simulate algorithms with synthetic data and transition to off-line implementation. 07/01/2018 – 10/31/2018
9 Develop salient object detection, classification, and identification algorithms (behavioral, feature, and pixel-level approaches) and transition to off-line implementation. 09/01/2018 – 12/31/2018
10 Replicate stand-alone prototype; expand to networked configuration. 01/01/2019 – 03/31/2019
11 Develop acoustic localization algorithms; simulate algorithms and transition to off-line implementation. 01/01/2019 – 05/15/2019
12 Conduct tests of networked prototypes. 05/15/2019 – 06/20/2019
Milestone: Demo networked prototype at ERAU. 06/24/2019 – 06/28/2019
Deliverable: Annual progress report to include analysis of test results. 06/30/2019
Deliverable: Submit an external grant proposal (SBIR/DHS Center of Excellence). 12/31/2019

Year 3
13 Select, procure, and integrate LIDAR sensor. 07/01/2019 – 10/31/2019
14 Select, procure, and integrate RADAR sensor. 11/01/2019 – 12/31/2019
15 Develop robust sensor integration algorithms: select between data-level, feature-level, or object-level fusion topology; develop and simulate active/passive fusion algorithms; implement (off-line) sensor fusion algorithms. 01/01/2020 – 04/30/2020
16 Regression test the updated networked configuration. 05/01/2020 – 06/15/2020
Milestone: Demo networked prototype at ERAU. 06/24/2020 – 06/28/2020
Deliverable: Final report to include performance, specification, and TRL 4 design. 06/30/2020
Deliverable: Submit an external grant proposal (NASA/NSF/ONR/DoD/FAA/ADAC). 12/31/2020

e) Proposed Budget
The project budget is reported in Table 3, and specific details are provided in Appendix 1.

Table 3: Project Budget.

Year   | Faculty Salary | Equipment (Non-capital) | Annual Total | Cost Share (Student Salary) | Cost Share (Capital Equipment) | Annual Total + CS
Year 1 | $35,000  | $15,000 | $50,000  | $5,000  | $10,000 | $65,000
Year 2 | $35,000  | $15,000 | $50,000  | $5,000  | $10,000 | $65,000
Year 3 | $35,000  | $15,000 | $50,000  | $5,000  | $10,000 | $65,000
TOTAL  | $105,000 | $45,000 | $150,000 | $15,000 | $30,000 | $195,000


References

[1] Birch, Gabriel Carisle, John Clark Griffin, and Matthew Kelly Erdman. UAS Detection, Classification, and Neutralization: Market Survey 2015. No. SAND2015-6365. Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States), 2015.
[2] FAA Tests FBI Drone Detection System at JFK, Federal Aviation Administration, July 2016. (retrieved 12/1/2016)
[3] Micro-Avionix, Ping2020 ADS-B transponder for UAS.
[4] McNeal, Gregory S. "Unmanned Aerial System Threats: Exploring Security Implications and Mitigation Technologies." Available at SSRN (2015).
[5] http://www.srcinc.com/what-we-do/ew/counter-uas.aspx (12/1/2016)
[6] http://www.detect-inc.com/drone.html (12/1/2016)
[7] http://www.iecinfrared.com/products/integrated-systems/narcissus/ (12/1/2016)
[8] http://www.dedrone.com/en/ (12/1/2016)
[9] https://www.droneshield.com/ (12/1/2016)
[10] https://www.x20.org/uav-drone-detection-anti-drone-defense-systems/ (12/1/2016)
[11] http://industrialcamera.net/security-surveillance-system-products-services/uav-detection/ (12/1/2016)
[12] https://www.hgh-infrared.com/es/Aplicaciones/Seguridad/UAV-Detection (12/1/2016)
[13] http://www.cerbair.com/en/ (12/1/2016)
[14] http://www.israeldefense.co.il/en/content/rafael-unveils-drone-dome-drone-detection-and-neutralization-system (12/1/2016)
[15] flightradar24.com, ADS-B, primary/secondary RADAR flight localization and aggregation services.
[16] Arctic Domain Awareness Research Center, U. of Alaska Anchorage, Theme 3, Task 2, SmartCam.
[17] S. Siewert, M. Vis, R. Claus, R. Krishnamurthy, S. B. Singh, A. K. Singh, S. Gunasekaran, "Image and Information Fusion Experiments with a Software-Defined Multi-Spectral Imaging System for Aviation and Marine Sensor Networks," (in preparation) AIAA SciTech, Grapevine, Texas, January 2017.
[18] Wang, Bin, and Piotr Dudek. "A fast self-tuning background subtraction algorithm." Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops, 2014.
[19] Hou, Xiaodi, and Liqing Zhang. "Saliency detection: A spectral residual approach." IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2007.
[20] Cheng, Ming-Ming, et al. "BING: Binarized normed gradients for objectness estimation at 300fps." IEEE CVPR, 2014.
[21] Richards, Mark A., James A. Scheer, and William A. Holm. Principles of Modern Radar. SciTech Pub., 2010.
[22] Brown, Christopher D., and Herbert T. Davis. "Receiver operating characteristics curves and related decision measures: A tutorial." Chemometrics and Intelligent Laboratory Systems 80.1 (2006): 24-38.
[23] Toet, A. "Computational versus psychophysical bottom-up image saliency: A comparative evaluation study." IEEE Transactions on Pattern Analysis and Machine Intelligence 33.11 (2011): 2131-2146.
[24] W. Hong and J.-J. E. Slotine, "Experiments in hand-eye coordination using active vision," in Proc. 4th Int. Symp. Exp. Robot. IV, London, UK: Springer-Verlag, 1997, pp. 130-139.
[25] M. Riley and C. G. Atkeson, "Robot catching: Towards engaging human-humanoid interaction," Auton. Robots, vol. 12, no. 1, pp. 119-128, 2002.
[26] U. Frese, B. Bauml, S. Haidacher, G. Schreiber, I. Schaefer, M. Hahnle, and G. Hirzinger, "Off-the-shelf vision for a robotic ball catcher," in Proc. IEEE/RSJ Int. Conf. Intell. Robot. Syst., 2001, vol. 3, pp. 1623-1629.
[27] S. Kim and A. Billard, "Estimating the non-linear dynamics of free-flying objects," Robot. Auton. Syst., vol. 60, no. 9, pp. 1108-1122, 2012.
[28] C.-C. Chang and C.-J. Lin, "LIBSVM: A library for support vector machines," ACM Trans. Intell. Syst. Technol., vol. 2, pp. 27:1-27:27, 2011.
[29] A. L. Barker, D. E. Brown, and W. N. Martin, "Bayesian estimation and the Kalman filter," Comput. Math. Appl., vol. 30, no. 10, pp. 55-77, 1995.
[30] Qiang Qiu, Zhuolin Jiang, and Rama Chellappa. "Sparse dictionary-based representation and recognition of action attributes." IEEE International Conference on Computer Vision (ICCV), 2011, pp. 707-714.
[31] Wenze Hu and Song-Chun Zhu. "Learning 3D object templates by quantizing geometry and appearance spaces." IEEE Transactions on Pattern Analysis & Machine Intelligence, (6):1190-1205, 2015.
[32] Renaud Detry, Carl Henrik Ek, Marianna Madry, and Danica Kragic. "Learning a dictionary of prototypical grasp-predicting parts from grasping experience." IEEE International Conference on Robotics and Automation (ICRA), 2013, pp. 601-608.
[33] FAA Aerospace Forecast, Fiscal Years 2016-2036, Federal Aviation Administration.
[34] "How consumer drones wind up in the hands of ISIS fighters," TechCrunch, October 13, 2016.
[35] Drone Industry Insights, https://www.droneii.com/


Curriculum Vitae (Principal Investigator)

Dr. Samuel B. Siewert [email protected]

http://mercury.pr.erau.edu/~siewerts

Embry Riddle Aeronautical University, Office 145, King Building, 3700 Willow Creek Rd, Prescott, AZ 86301

Cell: (303) 641-3999, Office: (928) 777-6929

Department, college and campus: Computer, Electrical, and Software Engineering, College of Engineering, Prescott Campus, ERAU

Highest degree attained, year, and institution: Ph.D. Computer Science, 2000, University of Colorado, Boulder

Short paragraph describing research expertise: Dr. Siewert’s research focus has been in the area of embedded systems with an emphasis on autonomous systems, computer and machine vision, hybrid reconfigurable architecture, and operating systems. Related research interests include real-time theory, digital media and fundamental computer architecture.

Total number of publications: Dr. Siewert's publication list includes:

• Two textbooks and contributor on a third,
• Two journal papers,
• More than nine invited panel/workshop presentations,
• Twenty conference publications, and
• Forty-two R&D publications.

Five of the most relevant publications to this proposal:
1. S. Siewert, M. Vis, R. Claus, R. Krishnamurthy, S. B. Singh, A. K. Singh, S. Gunasekaran, "Image and Information Fusion Experiments with a Software-Defined Multi-Spectral Imaging System for Aviation and Marine Sensor Networks," (in preparation) AIAA SciTech, Grapevine, Texas, January 2017.
2. S. Siewert, V. Angoth, R. Krishnamurthy, K. Mani, K. Mock, S. B. Singh, S. Srivistava, C. Wagner, R. Claus, M. Vis, "Software Defined Multi-Spectral Imaging for Arctic Sensor Networks," SPIE Algorithms and Technologies for Multispectral, Hyperspectral, and Ultraspectral Imagery XXII, Baltimore, Maryland, April 2016.
3. S. Siewert, J. Shihadeh, Randall Myers, Jay Khandhar, Vitaly Ivanov, "Low Cost, High Performance and Efficiency Computational Photometer Design," SPIE Sensing Technology and Applications, SPIE Proceedings, Volume 9121, Baltimore, Maryland, May 2014.
4. S. Siewert, M. Ahmad, K. Yao, "Verification of Video Frame Latency Telemetry for UAV Systems Using a Secondary Optical Method," AIAA SciTech, National Harbor, Maryland, January 2014.
5. S. Siewert, "Big Data Interactive: Machine Data Analytics – Drop in Place Security and Safety Monitors," IBM developerWorks, January 2014.

Total external funding: $90K to date (combined from Intel Corp., U. of Alaska, and DHS ADAC).

Five of the most recent external awards (PI or Co-I; amount; title; funding agency):
1. $25K; 2016; Co-I; Internal Grant for Undergraduate Research; Embry Riddle Aeronautical University.
2. $23K; 2015-16; PI; SmartCam, Arctic Domain Awareness; DHS, U. of Alaska, Theme 2, SmartCam.
3. $17K; 2014-15; Co-I; SmartCam Concept; U. of Alaska Anchorage Arctic Domain Awareness Ctr.
4. $90K; 2014-15; PI; Erasure Code Algorithm and Performance Analysis; Transductive LLC.

List any patents or other intellectual property developments:
1. US Pat. 8,473,779 - Systems and methods for error correction and detection, isolation, and recovery of faults in a fail-in-place storage array, granted June 25, 2013.
2. US Pat. 7,370,326 - Prerequisite-based scheduler, granted May 6, 2008.

List graduate student research advising:
1. M.S. committee, "Towards a Mathematical Model for Autonomously Organizing Security Metric Ontologies," Bryce Barrette, 2016, Embry Riddle Prescott.
2. Dissertation committee, "Robust QoS Scheduler in Open Real-Time Systems," Wang-ting Lin, 2009.
3. ECEN 8990, Doctoral Thesis Advising, Computer Vision, Spring 2014, University of Colorado Boulder.
4. ECEN 8990, Doctoral Thesis Advising, Computer Architecture, Fall, Spring 2004, 2005.
5. ECEN 5840, M.S. independent research, Internet of Things, Saliency, Fall 2016.
6. ECEN 5840, M.S. independent research, RT Salient Object Detection, Summer 2016.
7. ECEN 5840, M.S. independent research, Real-Time Sensor Fusion, Fall 2015, Spring 2016.
8. ECEN 5840/41, M.S. independent study, Real-Time Computer Vision, Fall 2013, Spring 2014, 2015.
9. ECEN 5840, Graduate Design, Ultrasound Venous Flow Visualization, Fall 2011.
10. ECEN 5840, M.S. independent study, Real-Time Digital Video, Summer 2006, Fall 2003, 2006.
11. M.S. independent research, "A Hybrid Sign Language Recognition System," Van Culver, 2004, CU Boulder.


Curriculum Vitae (Collaborator)

Dr. Iacopo Gentilini [email protected]

http://robotics.pr.erau.edu

Embry Riddle Aeronautical University, Office 106, King Building, 3700 Willow Creek Rd, Prescott, AZ 86301

Office: (928) 777-6626

Department, college and campus: Mechanical Engineering, College of Engineering, Prescott Campus, ERAU

Highest degree attained, year, and institution: Ph.D. Mechanical Engineering, 2012, Carnegie Mellon University

Total number of publications: Dr. Gentilini's publication list includes:

• Five journal papers,
• Eleven conference publications, and
• Three invited speaker presentations.

Five of the most relevant publications to this proposal:
1. N. Muraleedharan, F. Tempia, A. Tempia, I. Gentilini, "Bayesian Multi-model Regression and Frequency Analysis for Subgroup Identification," IEEE Transactions on Pattern Analysis and Machine Learning, in preparation, projected submission March 2017.
2. I. Gentilini, F. Margot, K. Shimada, "The Traveling Salesman Problem with Neighborhoods: MINLP Solution," Optimization Methods and Software, Volume 28, Issue 2, pp. 364-378, 2013.
3. N. Muraleedharan, D. Isenberg, I. Gentilini, "Recreating Planar Free-Floating Environment via Model-free Force-Feedback Control," Proceedings of the 2016 IEEE Aerospace Conference (AeroConf 2016), pp. 1-12, Big Sky, Montana, USA, March 5-12, 2016.
4. K. Vicencio, T. Korras, K. Bordignon, I. Gentilini, "Energy-Optimal Path Planning for Six-Rotors on Multi-Target Missions," Proceedings of the IEEE International Conference on Intelligent Robots and Systems (IROS 2015), pp. 2481-2487, Hamburg, Germany, September 28 - October 2, 2015.
5. F. Margot, I. Gentilini, K. Shimada, "The Traveling Salesman Problem with Neighborhoods: MINLP Solution," 21st International Symposium on Mathematical Programming, Berlin, Germany, August 19-24, 2012.

Five of the most recent external awards:
1. PI; $33,000.00; Dictionary Based Design Classifier and Optimizer (proposal awaiting review); Google Faculty Research Award.
2. PI; $30,000.00; Predictive Subgroup Identification (award pending); InnoCentive.

List graduate student research advising:
1. Narendran Muraleedharan, Embry-Riddle Aeronautical University (June 2015 - present).
2. Tian Wang, Carnegie Mellon University (October 2011 - May 2012).
3. CheongKin Ng, Carnegie Mellon University (October 2009 - December 2010).
4. Saikumar Reddy, Carnegie Mellon University (October 2009 - December 2010).


Letters of Support

1. Dr. Ron Madler, Dean of the College of Engineering, Embry Riddle Aeronautical University, Prescott.
2. Dr. Richard Stansbury, Associate Professor of Computer Engineering and Computer Science and Master's Program Coordinator, Unmanned and Autonomous Systems Engineering, Embry Riddle Aeronautical University, Daytona Beach.
3. Major General Randy "Church" Kee (Retired), Executive Director, Arctic Domain Awareness Center of Excellence (DHS), University of Alaska Anchorage.
4. Dr. Michael Hatfield, Associate Director of Education, Alaska Center for Unmanned Aircraft Systems Integration, University of Alaska Fairbanks.
5. Dr. Scott Palo, Victor Charles Schelke Endowed Professor, Associate Dean of Research, Research and Engineering Center for Unmanned Vehicles (RECUV), Colorado Center for Astrodynamics Research (CCAR), University of Colorado Boulder.
6. Professor Andrew Femrite, Senior Instructor and Faculty Director, Embedded Systems Engineering Program, University of Colorado Boulder.


Appendix 1: Detailed Equipment Budget

Drone-Net: Information Fusion Networks for Reliable Localization

Assembly Name : Drone-Net [42]

Assembly Number : Budget/BOM

Assembly Revision : REV1.0

Approval Date : 03-Dec-16
Part Count : 26
Total Cost : $74,999.99

Part Name | Description | Qty (Units) | Unit Cost | Cost | Comments

FLIR Boson 320 - 16° (13.8mm) - Professional | FLIR Boson longwave infrared (LWIR) thermal camera with embedded Movidius Myriad 2 12-core image processor, wide range of lens options, and FLIR's XIR expandable infrared video processing architecture | 4 each | $1,411.25 | $5,645.00 |
Point Grey Flea3 3.2MP Color USB 3.0 (Sony IMX036) Camera | FLIR/Point Grey megapixel USB3 camera; the USB 3.0 (5 Gbit/s) interface provides guaranteed delivery of critical image data using bulk transfers | 4 each | $795.00 | $3,180.00 |
NVIDIA Tesla K40C GPU Accelerator (Active Cooling) | GPU with no video outputs, designed exclusively to accelerate computationally intensive tasks such as transcoding video, rendering 3D models, cryptography, and analysis of complex data sets | 4 each | $3,190.04 | $12,760.16 |
Jetson TX1 Developer Kit | Full-featured development platform for visual computing and AI; pre-flashed with a Linux environment, includes support for many common APIs, and is supported by NVIDIA's complete development tool chain | 4 each | $599.00 | $2,396.00 |
LandMark 70 IMU | One of the world's highest performance NON-ITAR MEMS IMUs, enabled by G150Z gyros offering 0.0009°/sec/√Hz (~0.038°/√Hr) | | | $4,649.00 | Estimate, RFQ sent
FreeFly ALTA 8 with FREE M5 (950-00049) | With the addition of 2 extra motors over the Freefly ALTA 6, this octocopter will lift up to 20 pounds of payload versus 15 pounds for the ALTA 6, adding stability of flight, power, and increased payload | 1 each | $17,495.00 | $17,495.00 |
FreeFly ALTA 6 with FREE M5 (950-00030) | Unpacks, ready to shoot, in under five minutes; easy to fly, powerful, rigid, adaptable, reliable, and optimized for up to fifteen-pound payloads; flies RED, ARRI, and other professional cameras guided by the Synapse Flight Controller, with the option to mount the MoVI on top | 1 each | $11,995.00 | $11,995.00 |
Intel Xeon E7-8867 v4 or Intel Core i7-6900K Processor (Manufacturer Part # 00001) | Intel E7-8867L 2.13 GHz ten-core processor; Virtualization Technology enables migration of more environments, and enhanced SpeedStep technology allows tradeoffs between performance and power consumption | 1 each | $11,999.99 | $11,999.99 |
NETGEAR ReadyNAS 316 6-Bay Network Attached Storage, Diskless (RN31600) | 2.1 GHz dual-core processor and 2 GB on-board memory | 1 each | $599.00 | $599.00 |
Corsair Dominator Platinum 64GB (CMD64GX4M4B3333C2) | 288-pin DDR4 SDRAM, DDR4 3333 (PC4 26600) desktop memory | 1 each | $499.99 | $499.99 |
Samsung SSD 950 PRO 512GB NVMe (MZ-V5P512BW) | Max sequential read up to 2,500 MB/s; max sequential write up to 1,500 MB/s; 4KB random read up to 300,000 IOPS (QD32) | 3 each | $342.95 | $1,028.85 |
Samsung 55-Inch 4K Ultra HD Smart LED TV (UN55JS7000) | | 1 each | $1,199.00 | $1,199.00 |
NI PCIe-6353 (DAQ board), X Series Data Acquisition | NI X Series multifunction data acquisition (DAQ) devices provide a new level of performance with the high-throughput PCI Express bus, NI-STC3 timing and synchronization technology, and multicore-optimized driver and application software | 1 each | $1,553.00 | $1,553.00 |

Total | | 26 | | $74,999.99 |