
Senior Design Report

Engineering Design 2: Child Drone

Jeremy Goldberg, Andrew Blalock, Wilbert Ramos, Kim Lewis, Colin Cassidy, Bora Ocbe

Advisor: Dr. Kalva

FAU College of Engineering and Computer Science

July 23, 2015

Project Summary:

In order to create an autonomous drone that can be implemented in a dynamic

environment, many variables need to be taken into consideration. Using an open source flight controller called the Crazyflie, feedback from various sensors including a

barometer, magnetometer, accelerometer, gyroscope, ping sensors and cameras, our team designed a unit that can be carried on top of another larger drone. The drone will

be tasked with detecting vegetation states and gathering data from a sensor station over wifi.


Table of Contents

Executive Summary
1. Introduction
1.1 Problem Description
1.2 Significance of the Problem
1.3 Overview of the Solution
1.4 Goals and Objectives
1.5 Related Research
2. System Design
2.1 Project Requirements
2.2 Project Requirements Not Met
2.3 Project Requirements Added
2.4 System Diagram
2.5 State Diagram
3. System Implementation
3.1 Hardware
3.2 Software
3.3 User Interface
3.4 Data Communications
3.5 Testing
4. Budget
5. References


Executive Summary

Technology today is growing at an exponential rate, and we need to find ways to put it to use for our benefit. One use we see on a day-to-day basis, on the news or from at-home hobbyists, is the drone. Applications vary from taking pictures to wartime operations that keep armed services men and women safe. In this project, we take a narrower perspective to execute a more specific task.

One problem we see with drones is limited flight time, and a major burden here is weight. The longer a drone must fly, the bigger the battery needs to be to provide more power. But selecting a larger battery makes the drone heavier, and a heavier drone needs bigger motors, which in turn draw more power from the battery. This is a real constraint, because there are plenty of situations where drones are needed in areas that humans cannot get close enough to in order to launch one.

One example is rescue missions in desolate areas. Regions without infrastructure are difficult to observe; when people are lost in them, helicopters are typically used to search, but the humans doing the searching cannot see through a dense environment. One solution would be to attach a camera to a small drone and fly it through the region, so that a search could be conducted without the blockage of foliage or other obstructions. As stated before, though, the flight time of such a drone would be short because of the limits on battery size and drone weight. Another issue in this scenario is that a pilot flying the drone in a remote setting means one fewer set of eyes looking for the lost people or performing another task that could potentially save a life. It would therefore be more prudent to build a drone with an autopilot that could alert someone when something is found, using image processing or shape recognition to detect a human shape and send a location back remotely if possible. This is just one scenario; there are many other applicable circumstances. How does this play into our project?

Why not have two different types of drones: a large drone that transports a smaller drone to a designated area, allowing the smaller drone more flight time in that area to complete its assigned task or tasks? The larger drone (mother drone) has a longer range and transports the smaller drone (child drone) to a designated area to complete its assigned task. Since the child drone has stricter limits on flight time, it is more efficient to use a larger vessel such as the mother drone for the initial transportation.

The general objective of this project was to design an autonomous miniature drone that is as efficient as possible while still being able to complete its designated tasks. This meant omitting a GPS tracking system as well as large flight controllers for stabilization, among other things. One of the main tasks was to detect the health of vegetation. Vegetation health can be measured by the amount of light a plant gives off in particular channels, such as the red, green, blue, and near-infrared channels. As the sun feeds a plant, it provides the full spectrum of colors. The more chlorophyll a plant has, the healthier it is, and a plant with ample chlorophyll absorbs certain colors and gives off only certain others in the spectrum. Most devices used to measure this are large, bulky, and heavy, which is an issue given our weight constraints. Instead, we used a camera programmed to pick up the red, green, and near-infrared regions of the spectrum. This camera can be attached to the drone as it flies, with its data recorded onto an external memory device and processed after the drone has completed its task. After processing the images, a color map is produced that shows the different health levels of the vegetation so it can be analyzed.

Once the research on the necessary equipment was done and the parts were ordered, we assembled the drone. After much testing and debugging, an operational drone was constructed. Due to equipment failure, and time limitations on replacing the electronic speed controller, one of the motors was unable to receive a signal from the Crazyflie (the flight controller of our system). The navigation and collision detection systems were tested independently of the drone's structure. The SLAM navigation system had issues due to the limitations of the processor on the Raspberry Pi selected for use. The camera used for vegetation worked as intended but, due to a low frame rate, did not produce clear images. The work required team members to take on individual areas of focus while collaborating to ensure quality execution across the project. Our team was composed of six members: Jeremy Goldberg (Electrical Engineer), Andrew Blalock (Electrical Engineer), Kimberly Lewis (Electrical Engineer), Colin Cassidy (Electrical Engineer), Wilbert Ramos (Computer Engineer), and Bora Ocbe (Computer Engineer).


1 Introduction

1.1 Problem Description

Develop an easy-to-implement, cost-effective, lightweight drone that can detect vegetation that needs to be tended and gather data throughout the environment.

1.2 Significance of the Problem

Throughout the history of technology developed to make agriculture sustainable for mass populations, most solutions have been huge, clunky machines that are inherently static. To create a more dynamic way of tending crops, mounting special vegetation cameras on an inexpensive aircraft seems to be the most effective approach. This technology indirectly benefits everyone, because we all eat, but will more directly benefit farmers by cutting costs and helping reduce bad crops. Technology like this already exists, but it is not as cost effective as this solution.

1.3 Overview of the Solution

The drone we are developing will communicate over TCP to gather data, and will use SLAM and ROS to navigate. After gathering the relevant data, the drone will return to a larger drone and deliver the data required to show the current status of the crop.

1.4 Goals and Objectives

The goal is to develop an autonomous drone that will detect vegetation that needs to be tended and gather relevant data from local stations. The specific objectives are:

1. Design and implement a low cost, low power and low maintenance drone using an open source flight controller that can be easily reprogrammed in a dynamic environment depending on needs. (Jeremy Goldberg)

2. Design and implement software drivers to work with the hardware needed to fly the drone. (Bora Ocbe)

3. Install, configure, and optimize a SLAM algorithm on the Raspberry Pi (Wilbert Ramos)

4. Design and 3D print a custom frame as well as any parts needed, tailored to the physical requirements of the components used. (Kim Lewis)

5. Integrate digital output of system and add abstractions to achieve a working flying drone. (Colin Cassidy)

6. Design and find a way to detect the health of the local vegetation and plant life, and find and implement the correct sensors for collision detection and avoidance. (Andrew Blalock)


1.5 Related Research

Research into a monocular SLAM (Simultaneous Localization And Mapping) algorithm suitable for this project found that LSD-SLAM could be a viable solution, having been run on embedded systems such as the ODROID. LSD-SLAM was chosen because it can run in real time and in a Linux environment. More information about LSD-SLAM can be found at http://vision.in.tum.de/research/lsdslam. All Crazyflie-related research is open to developers at https://wiki.bitcraze.io/; the wiki supplied us with the Python API, motor controls, and their open-sourced software. While researching ways to detect the health of vegetation, the method we came across is called NDVI, the Normalized Difference Vegetation Index. Plants absorb solar radiation as the source of energy used in photosynthesis, but leaf cells have evolved to scatter solar radiation in the near-infrared spectrum, because that energy is not sufficient for photosynthesis. As a result, a healthy plant absorbs most of the colors in the RGB color spectrum while releasing much of the near-IR spectrum, producing an image in which healthy plants appear very bright when viewed in the IR spectrum. The NIR (near-infrared) and VIS (visible) measurements each produce a value between 0.0 and 1.0, and the NDVI value is calculated as:

NDVI = (NIR - VIS) / (NIR + VIS)

The higher the NDVI value, the better the health of the vegetation.

This formula gives a relative measurement of vegetation health, which can be displayed as the color map shown below.

Figure 1 - Vegetation Filter
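The NDVI computation above is straightforward to express in code. The following is a minimal sketch using NumPy, assuming the NIR and visible bands have already been extracted from the camera image and normalized to the 0.0-1.0 range; the array values are illustrative, not from the project's actual processing pipeline.

```python
import numpy as np

def ndvi(nir, vis):
    """Per-pixel NDVI from NIR and visible bands normalized to 0.0-1.0."""
    nir = np.asarray(nir, dtype=float)
    vis = np.asarray(vis, dtype=float)
    denom = nir + vis
    denom[denom == 0] = 1e-9  # avoid division by zero on fully dark pixels
    return (nir - vis) / denom

# A healthy pixel reflects strongly in NIR and absorbs visible light:
healthy = ndvi(np.array([0.8]), np.array([0.1]))   # ~0.78
stressed = ndvi(np.array([0.4]), np.array([0.3]))  # ~0.14
```

Mapping each pixel's NDVI value onto a color scale yields a color map like the vegetation filter figure above.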


2 System Design

2.1 Project Requirements

The broad requirements for the child drone include launching from the mother drone as well as navigating autonomously. In addition, the child drone needs to be able to gather data from the environment, including from a sensor station. Finally, it needs to be able to return to the mother drone.

2.1.1. The system shall launch from the mother drone.

2.1.2. The system shall navigate autonomously.

2.1.3. The system shall gather environmental data.

2.1.4. The system shall return to the mother drone.

2.1.5. The system shall, in the event of an emergency, activate a failsafe routine.

2.1.6. The system shall receive mission parameters wirelessly from a ground control station.

2.1.7. The system shall receive data wirelessly from a ground sensor station.

2.2 Project Requirements Not Met

The first project requirement that was not met was the plan to design and 3D print the frame of the drone ourselves. Three different frame designs were thought to be acceptable, but once all physical components were assembled onto any of the three, the arms were not long enough for the propellers to clear the electrical components in the center. Extra plates were also 3D printed so the various components could be stacked, but these added a lot of weight. Unfortunately, this became evident with insufficient time left to make the required modifications. Thus, due to the time constraints, it was decided to purchase a full carbon fiber frame that provided everything necessary.

2.3 Project Requirements Added

To rectify the problem of the 3D printed frame not being usable, a full carbon fiber frame was purchased and added to the list of requirements. This proved very useful, as all components could be mounted without any issue.


2.4 System Diagram

Figure 2 - System Flowchart


2.5 State Diagram

Figure 3 - State Diagram

Design Meets Requirements: The requirements of the design include using a specially filtered camera to detect vegetation and gathering data from a sensor station over wifi. The vegetation camera is USB based, and its output is easily saved to the local SD card on the Pi. The original plan was to use the vegetation camera for both SLAM and detecting vegetation, but due to the camera's limitations, a second fisheye lens camera that could handle a higher FPS was used. The initial PCB design called for one voltage regulator, but due to over-current, a second regulator had to be added to support the 3.3V rail for the signal amplifier and Crazyflie. Ideally, designing and biasing our own voltage regulator would have been more efficient; however, due to time constraints, a pre-built board was implemented. See Figure 8. The battery designated for the design powers all voltage rails, which simplifies the design given the voltage and current regulation mentioned above. 5-inch, 3-blade propellers were used to generate the maximum thrust possible.


Mechanical and Electrical Design: The ideal physical dimensions are 6" x 6", with everything mounted in layers around an even center of balance. The final design incorporates a full carbon fiber frame that is just outside these parameters at 7.91" x 6.3". The custom designed PCB allows the use of one battery for three different voltage rails (3.3V, 5V, 7.4V). These rails are required to power the flight controller (Crazyflie 2.0), the Raspberry Pi, and the ESC. The ESC controls the 3-phase motors and carries preloaded custom firmware that outputs the correct waveforms. Due to the complicated nature of these motors, default specifications were used, which was sufficient for control. The flight controller itself comes with a prebuilt app for manual control, which was used for prototyping. It is controlled by an STM processor connected over radio to a Raspberry Pi. The Pi runs Robot Operating System (ROS) to incorporate automation. Using feedback from various sensors (magnetometer, barometer, accelerometer, gyroscope, ping) and the SLAM camera system, the child drone is autonomous and can operate in a dynamic environment.

3 System Implementation

3.1 Hardware

This section describes the components selected to implement each of the sub-systems, covering the parts for the mechanical and electrical design along with a description of the mechanical design.

1. PCB
Make and Model: Custom designed
Functional Requirements: The project needs three different power rails: 3.3V, 5V, and 7.4V. To keep everything in a small package that saves space and weighs very little, designing a PCB to incorporate the power rails and a logic level converter (LLC) to amplify the 2.8V pulses coming from the Crazyflie to the new motors was the most effective way to proceed.
Component Description: The PCB was designed in EAGLE CAD. The components used are as follows: a standard DC-DC buck converter (7.4V → 5V for the Pi power source; can output up to 3A), an LLC (amplifies the peak-to-peak output to the motors using 4 MOSFETs), a resistor network to power the 3.3V rail for the Crazyflie, and banana plugs (to deliver 7.4V to the buck converter and ESC).
Testing and Verification: After the PCB was milled, components were fitted and soldered down. A DMM was used to ohm out the connections and ensure they were soldered correctly. Next, a power supply was used to test all the power rails: a 7.4V source was applied at the banana connections and a DMM was used to measure the voltage at the respective points.


Figure 4 - PCB Design

Figure 5 - PCB Final Product

2. ESC (Motor Speed Controller)
Make and Model: Favourite 4-channel ESC with 12A output
Functional Requirements: Brushless motors require a 3-phase input, therefore a speed controller is required.
Component Description: The firmware in the controller takes the incoming pulses and outputs them as three separate signals that feed the three phases in succession. Due to the magnetic nature of the motors, the timing on the outputs has to hold a small margin of error. This is all taken care of by the firmware preloaded on the unit; it can, however, be reprogrammed.
Testing and Verification: Using the proprietary interface for the speed controller, I verified all channels have the same specifications to ensure stability. When connecting the ESC to the motors there is no specific color code, so I used trial and error to ensure motors 1-4 were spinning in the correct directions (CW or CCW). It is possible to reprogram the ESC if a motor spins the wrong way, but there is no need, since it is also possible to swap two of the three inputs to change the direction. I chose this approach.

Figure 6 - Speed Controller

3. Logic Level Converter ("Motor Input")
Make and Model: SparkFun Logic Level Converter - Bi-Directional
Functional Requirements: Examining the output of the Crazyflie and the input of the ESC and motors showed that the 3.3V output pulse of the Crazyflie alone was not enough to drive the brushless motors selected: the ESC requires a 5V input pulse to drive the motors. Since the pulses can be represented digitally, the team approached this as a simple logic level discrepancy. Because only the amplitude of a square wave had to be increased, a MOSFET level converter could be used to accomplish this. The digital logic converter allows the ESC to be driven with 5V square wave input pulses.
Component Description: A small bi-directional, MOSFET-based level-shifting board that translates logic signals between a 3.3V side and a 5V side across several channels. Per its specification, it can shift the Crazyflie's 3.3V pulses up to the 5V levels the ESC inputs expect, which meets the project requirement.
Testing and Verification: We conducted a simple pulse train test in the electronics labs using a Tektronix AFG 3021 single-channel function generator, showing that the LLC can correctly take a 3.3V pulse and step it up to a 5V pulse that can drive the motors. The team uploaded a video of the testing process to Instagram: https://instagram.com/p/3fXPyJHXM_/


Figure 7 - Signal Amplifier

4. Voltage Regulators (Circuit Biasing, Current Protection)
Make and Model: N/A
Functional Requirements: Using a resistor network creates an inherent problem: the small resistor values needed to pass the proper current to the various components subject the system to fluctuations and possible burnout of the resistor network. A voltage regulator is required to have the Crazyflie function correctly while simultaneously supplying enough current to power the system and any future additions. The 7.4V battery is clearly too much for the Crazyflie; by using a voltage regulator, we can drop the voltage down to a level at which the Crazyflie works.
Component Description: The component itself is a small PCB carrying various electrical components, such as a TI LM2596 buck converter, a potentiometer, and filtering capacitors. The potentiometer is used to adjust the output voltage up or down.
Testing and Verification: The voltage regulator has several functioning parts, but the potentiometer is the main device used to adjust the output voltage to a higher or lower level. Testing the component was as simple as tying the voltage regulator to a breadboard, using a power supply to represent our battery, and taking measurements with a DMM from the labs.
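For reference, the output of an adjustable LM2596-based board like this one is set by its feedback divider; per the TI datasheet, Vout = Vref x (1 + R2/R1) with Vref approximately 1.23 V. The sketch below only illustrates that relationship; the resistor values are hypothetical, not measured from the board used here (in practice the on-board potentiometer plays the role of R2).

```python
V_REF = 1.23  # LM2596 internal feedback reference voltage, per the TI datasheet

def lm2596_vout(r1_ohms, r2_ohms):
    """Output voltage of an adjustable LM2596 set by its feedback divider."""
    return V_REF * (1 + r2_ohms / r1_ohms)

# Hypothetical divider giving roughly the 3.3 V Crazyflie rail:
print(round(lm2596_vout(1000, 1700), 2))  # 3.32
```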


Figure 8 - Buck Converter

5. Custom Frame
Make and Model: Custom frame printed on an AirWolf HD2X 3D printer
Functional Requirements: The many requirements placed on the frame for decent performance and functionality made a custom design a viable choice. Originally, custom designing the frame was the best option, considering the convenience of altering the design if necessary and the many specifications involved, such as weight management, multiple components, and stability.
Component Description: The material used in the print was acrylonitrile butadiene styrene (ABS), a common thermoplastic preferred in 3D printing for its strength, flexibility, and high temperature resistance.
Testing and Verification: Physical prints were needed to test the frame; three different prototypes were printed, all in the same material. The first prototype had the following dimensions: 0.25 in. thickness, 1 in. arm width, 0.50 in. radius ends, and 0.13 in. diameter extruded holes for motor placement. This prototype was printed to give the team an idea of what was desired throughout the process, and it actually ended up being the most successful. The second printed frame kept most of the same dimensions but reduced the thickness from 0.25 in. to 0.15 in. to give the motors less weight to support; we encountered errors with the legs added during this print due to size constrictions. For the third prototype, these dimensions were enlarged, and plates were printed as well for layering the physical components (ESC, Raspberry Pi, Crazyflie, battery). When assembling all physical components onto the latest frame, it was found that the arms needed to be extended and altered due to lacking propeller clearance. It was then decided to invest in a larger carbon fiber frame providing both clearance for the propellers and storage for the various components.


Pictured below is a rendering of the first 3D printed prototype.

Figure 9 - Original Frame Design

Pictured below is a rendering of the second 3D printed prototype.

Figure 10 - Enhanced Frame Design

6. Carbon Fiber Frame
Make and Model: YKS DIY Full Carbon Fiber Mini C250 Quadcopter Frame Kit for FPV Mini Quadcopter
Functional Requirements: Due to time constraints and unsuccessful attempts to 3D print the frame in the time allocated, it was decided to invest in a carbon fiber frame to test for flight. This frame is durable and small enough to meet the length constraints, yet light, with plenty of room for expansion, which is desirable for the project.
Component Description: The carbon fiber frame includes four arms, multiple layers, and a center section with holes available for component placement. It is made entirely of carbon fiber, making it very light while retaining the frame's structural strength. A benefit of this frame is the choice of how many screws, plates, and added tools to use or exclude for weight management. The frame also includes legs so it can stand alone. The total weight of the frame, including the Crazyflie, ESC, Raspberry Pi, connections, motors, propellers, and sensors, is 530 grams. The goal is a 2-to-1 ratio of thrust to weight; the overall thrust is 800 grams, so 530 grams is a decent value.
Testing and Verification: For testing and verification, it was established that the frame is able to hold all electrical components and devices.

Figure 11 - Final Product
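The thrust-to-weight figures quoted above can be sanity-checked with a one-line calculation; this sketch simply uses the numbers stated in this section.

```python
def thrust_to_weight(total_thrust_g, all_up_weight_g):
    """Thrust-to-weight ratio; the stated goal is 2.0 or better."""
    return total_thrust_g / all_up_weight_g

# Figures from this section: 800 g of combined thrust, 530 g all-up weight.
ratio = thrust_to_weight(800, 530)
print(round(ratio, 2))  # 1.51 -- short of the 2:1 goal, but above the 1.0
                        # needed to hover, so the drone can still fly
```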

7. Motors
Make and Model: SunnySky X1306S 3100 Kv Brushless Multi-rotor Motor Set (including 2 motors)
Functional Requirements: The motors were chosen to provide enough thrust to lift the frame and all components the frame carries. The calculations made indicated this was the best selection of motors to use.
Component Description: Physical specifications of the motors: motor weight 11.5 g, stator diameter 13 mm, stator length 6 mm, overall diameter 17.7 mm, length (including shaft) 27.4 mm. The motors use neodymium magnets and polyesterimide-enameled, oxygen-free copper wire rated up to 180 degrees C.
Testing and Verification: To verify the functionality of the motors prior to implementation, we used a signal generator to simulate the Crazyflie's output into the ESC and test the motors. We tested the 50-60 Hz frequency range and saw no noticeable difference.


Figure 12 - X1306S SunnySky Motor

8. Crazyflie
Make and Model: Bitcraze: Crazyflie 2.0
Functional Requirements: The Crazyflie is needed to control our motors while providing an interface for automation. The Crazyflie also has a built-in IMU, which saves the need to buy additional sensors.
Component Description: It has 4 built-in motor adapters, which we use to solder our own motors. There is a built-in radio module coupled with a full API to control the drone from the Raspberry Pi. There is also a built-in IMU, which houses an accelerometer, magnetometer, and barometer.
Testing and Verification: The Crazyflie sensors were tested by connecting to our PCs and running the desktop client, which logs all the sensor information. We then tested the radio module by running a flight script from a PC to the Crazyflie. We tested the motor connections by connecting them to an oscilloscope to measure the output.

9. Raspberry Pi
Make and Model: Raspberry Pi Foundation: Raspberry Pi 2 Model B
Functional Requirements: The Raspberry Pi is needed to process and store data, in addition to localizing and mapping the environment around the child drone. The Raspberry Pi is responsible for navigating the drone and sending commands to the Crazyflie.
Component Description: The Raspberry Pi has 4 USB 2.0 ports, 40 GPIO pins, an HDMI port, an Ethernet port, a combined 3.5mm audio and composite video port, camera and display interfaces, and a VideoCore IV 3D graphics core. By default, the Raspberry Pi runs at 900 MHz with 1 GB of RAM.
Testing and Verification: The Raspberry Pi was tested by installing the necessary operating systems and drivers for all the devices we plan on using. We also measured the speed and latency of the Raspberry Pi using utilities preinstalled in the operating system to gauge its efficiency.


Figure 13 - Raspberry Pi 2 Model B

10. Fisheye Camera
Make and Model: Raspberry Pi Foundation: Raspberry Pi 2 Model B
Functional Requirements: The fisheye camera was chosen as an ideal candidate for use in our SLAM algorithm. The most desired attributes were the ultra-wide field of view (FOV) the camera gave us, in addition to its USB capability. With a wide FOV, SLAM was able to gather more information about the environment and could better track the points it needed to complete localization.
Component Description: The fisheye camera has a 180-degree FOV and delivers a 2-megapixel image. It delivers 60 fps video at 1280x720 resolution, or 30 fps video at 1920x1080 resolution.
Testing and Verification: To test the camera, the GNOME drivers were installed onto the OS, and gstreamer-properties was run to grab a video stream. To calibrate the camera, we used the cameracalibrator package provided by ROS to generate a configuration file with metadata describing the features of the camera, including distortion, resolution, and FOV. Finally, LSD-SLAM was used to determine whether the camera could efficiently grab information from a scene without easily losing frames during movement.


Figure 14 - SLAM Fisheye Camera

11. Infragram Camera
Make and Model: Raspberry Pi Foundation: Raspberry Pi 2 Model B
Functional Requirements: The Infragram camera is a webcam with the IR filter removed from its photo sensor so that it can capture information from both visible and infrared wavelengths. By capturing IR wavelengths, we can process the image for more information about the vegetation being photographed; this can tell us how well the plant is photosynthesizing.
Component Description: The Infragram camera delivers 1600x1200 images in which the red channel is replaced by the near-IR channel, alongside the normal visible wavelengths.
Testing and Verification: To test the camera, the GNOME drivers were installed onto the OS, and gstreamer-properties was run to grab a video stream. We also grabbed images to verify that they could be post-processed to give more information about vegetation health.


Figure 15 - Vegetation Filtered Photo

Figure 16 - Vegetation Camera

3.2 Software

The software is abstracted into two layers: the automation layer and the controller layer. The controller layer lives on the Crazyflie, while the automation layer lives on the Raspberry Pi. The controller layer works as follows:

1. At boot, a task is created for each module in the system.
2. It loads our own brushless motor drivers.
3. It waits for input from the radio controller.

The automation layer:

4. A sensor script runs to constantly read data.
5. LSD-SLAM runs as a service for the camera through Robot Operating System (ROS).
6. Code grabs any data from the ground sensor station before launch.
7. The main automation code then runs; it grabs data from the camera and sensors and determines where the drone is currently located and where it needs to go.
8. The navigation is sent over radio to the Crazyflie controller.
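The steps above can be sketched as a simple control loop. Everything in this sketch is a hypothetical stand-in for the project's actual scripts; the function names and return values are illustrative only.

```python
# Illustrative sketch of the automation layer's main loop. Every helper here
# is a hypothetical stand-in, not the project's actual code.

def read_sensors():
    """Step 4: the sensor script constantly reads data."""
    return {"altitude_m": 1.2, "heading_deg": 90}

def slam_pose():
    """Steps 5/7: LSD-SLAM (via ROS) supplies a pose estimate."""
    return (0.0, 0.0, 1.2)  # x, y, z in meters

def fetch_ground_station_data(store):
    """Step 6: grab data from the ground sensor station before launch."""
    store["ground"] = "station-data"

def plan_next_waypoint(pose, sensors):
    """Step 7: decide where the drone needs to go next."""
    x, y, z = pose
    return (x + 0.5, y, z)

def send_over_radio(waypoint):
    """Step 8: hand the navigation command to the Crazyflie over radio."""
    return "goto %.1f %.1f %.1f" % waypoint

def automation_tick(store):
    """One pass of the main automation loop (steps 4-8)."""
    sensors = read_sensors()
    pose = slam_pose()
    waypoint = plan_next_waypoint(pose, sensors)
    return send_over_radio(waypoint)

store = {}
fetch_ground_station_data(store)   # runs once, before launch
command = automation_tick(store)   # then the loop repeats in flight
```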

Simultaneous Localization And Mapping (SLAM) is a robotic mapping technique that allows us to approximate the position of the drone in an unknown environment in real time. Since we were restricted to a single camera, we looked for a SLAM implementation that gathers information from the environment using monocular vision. Another desired feature was loop closure: the ability to recognize previously mapped landmarks in the environment and use them to accurately restructure the map. We finally settled on LSD-SLAM because it fit our requirements and was open source, so adjustments could be made to tailor it to our platform. LSD-SLAM works by using the main camera to map and localize through image intensity. The algorithm also forms a pose graph, a 3D model of the environment, which can be post-processed afterwards. LSD-SLAM has been tested on Android smartphones to see whether it was viable for augmented reality (AR), and has therefore been proven to be relatively computationally lightweight. In the pointcloud image below, we can see the pointcloud before loop closure (left) and after loop closure (right). By closing loops, we obtain a better map; in addition, this allows the drone to 'know' how to return to its starting location.

Figure 14 - LSD-SLAM Pointclouds (Top) and Depth Maps (bottom)
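LSD-SLAM's pose-graph optimization is far more involved than we can show here, but the core idea of loop closure can be illustrated with a toy 1-D drift correction: when the drone revisits a known landmark (here, its start), the mismatch between the estimated and known position is spread back over the trajectory. The function and the linear error distribution are purely illustrative.

```python
def close_loop(path, true_end):
    """Toy loop closure: distribute the end-point error linearly
    over a 1-D estimated trajectory so it ends at the known point."""
    n = len(path) - 1
    error = true_end - path[-1]
    return [p + error * i / n for i, p in enumerate(path)]

# Estimated positions drift by 0.4 over a loop that should return to 0.0:
corrected = close_loop([0.0, 1.0, 2.1, 1.2, 0.4], 0.0)
print(corrected)
```

The corrected trajectory ends exactly at the revisited landmark while earlier poses move only slightly, which is the qualitative effect visible in the before/after pointclouds above.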

All software used for the Crazyflie is free to use, provided by Bitcraze. We added our own code to their drivers to implement the brushless motors. The sensor code was written by us; it runs in the background as a startup service. LSD-SLAM and ROS are available under open-source licenses and can be run from the command line either automatically or remotely.
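As an illustration of what the sensor service computes, a ping (ultrasonic) sensor reading reduces to converting the echo's round-trip time into a distance. The GPIO access itself is hardware-specific and omitted here; the function name is our own for the sketch.

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C

def echo_to_distance(round_trip_s):
    """Convert a ping sensor's round-trip echo time (seconds) to metres.
    The pulse travels out and back, so distance is half the path length."""
    return SPEED_OF_SOUND * round_trip_s / 2.0

# A ~5.8 ms round trip corresponds to roughly 1 m of clearance.
print(echo_to_distance(0.00583))
```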


3.3 User Interface

There is no user interface.

3.4 Data Communications

A WiFi dongle is attached to the Raspberry Pi through USB. The Pi is configured to connect automatically to the sensor station's private network (wifi-PI). A script then downloads all relevant data to the SD card, into a folder called Data. This runs before launch, so no constant connection to the ground sensor station is needed.
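A minimal sketch of that pre-launch download is below. The station address, the flat file layout, and the function names are assumptions made for illustration; the real station's interface may differ. The `opener` parameter exists so the transfer logic can be exercised without a live network.

```python
import os
import urllib.request

STATION = "http://192.168.42.1"  # assumed address on the wifi-PI network

def fetch_station_data(filenames, dest="Data", opener=urllib.request.urlopen):
    """Mirror the named files from the ground sensor station into the
    local Data folder on the SD card; return the saved local paths."""
    os.makedirs(dest, exist_ok=True)
    saved = []
    for name in filenames:
        local = os.path.join(dest, name)
        with opener("%s/%s" % (STATION, name)) as resp, open(local, "wb") as out:
            out.write(resp.read())
        saved.append(local)
    return saved
```

Because the download finishes before launch, a dropped WiFi link in flight costs nothing: the automation layer only ever reads the local copies.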

3.5 Testing

We used a JTAG debugger, which lets us inspect the Crazyflie's firmware in real time, to help test our drivers. Bitcraze's flight-controller desktop client was used to test controlling the drone from the Raspberry Pi. For SLAM, we tested the algorithm by mapping the immediate environment around the Pi with the camera; this verified the ability of the Pi and the algorithm to collect data through various camera movements. In addition, we loaded videos onto the Pi and streamed them into the SLAM algorithm to benchmark it and evaluate its effectiveness. Both methods should also produce a pointcloud object that can easily be translated into a 3D model. For testing the electronics, a DMM, a signal generator, and an oscilloscope were used; the individual methods are described in section 3.1.
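The video-based benchmark amounts to timing the SLAM pipeline per frame. A sketch of such a harness is below; `process_frame` is a stand-in, since on the Pi the frame would be handed to LSD-SLAM through ROS rather than processed inline.

```python
import time

def benchmark(frames, process_frame):
    """Feed frames through a processing function and return the
    average wall-clock seconds spent per frame."""
    start = time.perf_counter()
    for frame in frames:
        process_frame(frame)
    return (time.perf_counter() - start) / len(frames)

# Example with a trivial stand-in workload instead of real SLAM:
avg = benchmark(range(100), lambda frame: sum(range(1000)))
print("avg seconds/frame: %.6f" % avg)
```

Replaying the same prerecorded video before and after a code change gives a controlled comparison that live camera runs cannot.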


4 Budget

The table below lists each part, the number of units purchased, the unit price, and the total project cost.

Crazyflie 2.0 — 2 × $215.00 = $430.00
    http://www.seeedstudio.com/depot/Crazyflie-20-p-2103.html?cPath=84_147

Raspberry Pi — 2 × $42.50 = $85.00
    http://www.amazon.com/Raspberry-Pi-Model-Project-Board/dp/B00T2U7R7I/ref=sr_1_1?s=electronics&ie=UTF8&qid=1429195192&sr=1-1&keywords=raspberry+pi+2+model+b

WiFi Dongle — 2 × $10.00 = $20.00
    http://www.amazon.com/gp/product/B003MTTJOY/ref=ox_sc_act_title_1?ie=UTF8&psc=1&smid=A2N1S6D8VVCNZ3

SD Card — 2 × $16.50 = $33.00
    http://www.amazon.com/SanDisk-Memory-Adapter-SDSDQUAN-032G-G4A-Version/dp/B00M55C0NS/ref=pd_bxgy_pc_img_z

Motors — 4 × $39.00 = $156.00
    http://www.amazon.com/SunnySky-X1306S-Brushless-Multirotor-Motor/dp/B00R2DOKKE

Electronic Speed Control — 1 × $30.00 = $30.00
    http://www.hobbyking.com/hobbyking/store/__69353__Favourite_Sky_3_Quattro_4_x_12A_Brushless_Quadcopter_ESC_2_4S_1A_BEC_BLHeli.html

Battery — 1 × $23.50 = $23.50
    http://www.amazon.com/Duratrax-LiPo-Onyx-7-4V-2000mAh/dp/B007KMWPXQ/ref=sr_1_2?ie=UTF8&qid=1429125820&sr=8-2&keywords=7.4v+2s+25c+2000mah+lipo

Voltage Regulators — 4 × $8.00 = $32.00
    http://www.amazon.com/Retailstore-LM2596-Adjustable-Supply-Converter/dp/B009HPB1OI/ref=sr_1_1?ie=UTF8&qid=1429300274&sr=8-1&keywords=dc-dc+step+down+voltage+converter

Logic Level Converter — 3 × $7.00 = $21.00
    http://www.amazon.com/gp/product/B00TMILEWY?psc=1&redirect=true&ref_=oh_aui_detailpage_o03_s00

Frame — 1 × $40.00 = $40.00
    http://www.amazon.com/gp/product/B00WE9EGAA?psc=1&redirect=true&ref_=oh_aui_detailpage_o01_s00

Bullet Connectors — 1 × $5.70 = $5.70
    http://www.amazon.com/gp/product/B00EK96TLQ?psc=1&redirect=true&ref_=oh_aui_detailpage_o01_s00

Infragram DIY Plant Analysis Webcam — 1 × $55.00 = $55.00

Ping Sensors — 8 × $29.50 = $236.00
    http://www.amazon.com/Parallax-Ultrasonic-Range-Sensor-28015/dp/B004SRTM0K

USB Fisheye Webcam — 1 × $45.00 = $45.00
    http://www.amazon.com/gp/product/B00LQ854AG?psc=1&redirect=true&ref_=oh_aui_detailpage_o02_s00

Traxxas Battery Connector — 1 × $7.00 = $7.00
    http://www.amazon.com/gp/product/B005EFZMQI?psc=1&redirect=true&ref_=oh_aui_detailpage_o07_s00

Total: $1,219.20


5 References

J. Engel, J. Stückler, and D. Cremers, "LSD-SLAM: Large-Scale Direct Monocular SLAM," Computer Vision Group. [Online]. Available: http://vision.in.tum.de/research/lsdslam. [Accessed: 2015].

J. Engel, T. Schöps, and D. Cremers, "LSD-SLAM: Large-Scale Direct Monocular SLAM," in European Conference on Computer Vision (ECCV), 2014.

Public Lab, "Infragram: online infrared image analysis." [Online]. Available: http://www.infragram.org/.

NASA Earth Observatory, "Measuring Vegetation (NDVI & EVI): Feature Articles." [Online]. Available: http://earthobservatory.nasa.gov/Features/MeasuringVegetation/.

"Raspberry Pi - Teach, Learn, and Make with Raspberry Pi," Raspberry Pi Home. [Online]. Available: https://www.raspberrypi.org/. [Accessed: 2015].

"ROS.org | Powering the world's robots," ROS.org. [Online]. Available: http://www.ros.org/. [Accessed: 2015].

T. Schöps, J. Engel, and D. Cremers, "Semi-Dense Visual Odometry for AR on a Smartphone," in International Symposium on Mixed and Augmented Reality, 2014.

J. Engel, J. Sturm, and D. Cremers, "Semi-Dense Visual Odometry for a Monocular Camera," in IEEE International Conference on Computer Vision (ICCV), 2013.