
Embedded Control Systems Project

- Cooperative Control

22 May 2012

By

Amardeep Mehta

Amir Motevakel

Razee Hussein-Jamal

Tom Homewood

Vishnuvardhan Avula


Table of Contents

1. Introduction
2. Modelling and simulation
3. Line tracking - Leader
4. Vision and formation control – Followers
4.1. Positioning and navigation by machine vision
4.2. Vision sensor
4.2.1. NXTcam v3.0 sensor
4.2.2. Colormap
4.2.3. Colormap Structure in NXTCam
4.3. Real-world limitations of camera sensor and solutions
4.3.1. Intensity perception in human eyes vs. camera
4.3.2. Effect of fast moving on captured image
4.3.3. Dispersion
4.4. Software techniques to increase the performance of the camera
4.4.1. Make readings of the angle and distance independent
4.4.2. Correlating size of object to distance
4.4.3. Tracking the object
4.5. Main controller
4.6. Possible improvements for vision
5. Communication
5.1. Packet format
5.2. Example transmission
5.3. Transmission of messages
5.4. Reception of messages
5.5. Performance
6. Conclusion


1. Introduction

This report is part of the project titled “Cooperative Control”, which was carried out during the course

“Embedded Control Systems Project”. In close correspondence with the title, the project deals with a

group of cooperating units whose prime objective is to maintain a geometric formation, as rigid as

possible, while they traverse along an arbitrary track.

The cooperating units used here are a triplet of wheel-based robots built on easy-to-assemble Lego Mindstorms NXT. A triangle has been chosen as the formation to be maintained. One of the robots serves as a dedicated leader while the other two follow it along a chosen track. The leader is a line-tracking robot that uses a light sensor to find its way along the track, whereas the followers are equipped with cameras that lock onto a marker carried by the leader. Each follower runs a recurrent control algorithm that maintains its position in the formation based on the camera readings. In addition, the followers rely on signals from the leader indicating the beginning and end of a curve along the track; this information helps to preserve the rigidity of the formation in the curves to some extent. All inter-robot communication is Bluetooth based. The software was developed using LeJOS with the aid of the Eclipse IDE, and the modelling and simulations were done in MATLAB.

The project draws on ideas from a multitude of subjects, such as automatic control, wireless communication, modelling and simulation, and the programming of embedded systems.

2. Modelling and simulation

Stage I: Implementation of the basic state-space equations

The following differential equation [1] governs the angular movement of the robot:

    \dot{\theta} = \frac{r(\omega_R - \omega_L)}{L}    – (1)

where \theta is the heading of the robot, \omega_R and \omega_L are the angular velocities of the right and left wheels, r is the radius of a wheel, and L is the distance between the wheels. Writing the planar motion in terms of the translational velocity v results in:

    (\dot{x}, \; \dot{y}) = (v\cos\theta, \; v\sin\theta)    – (2)

Separating these equations we get:

    \dot{x} = v\cos\theta    – (3)

and similarly

    \dot{y} = v\sin\theta    – (4)

As the translational velocity is the average of the left and right wheel velocities,

    v = \frac{r(\omega_R + \omega_L)}{2}    – (5)

Substituting v from equation (5) into equations (3) and (4) we get:

    \dot{x} = \frac{r}{2}(\omega_R + \omega_L)\cos\theta, \quad \dot{y} = \frac{r}{2}(\omega_R + \omega_L)\sin\theta    – (6)

Now the state-space representation of equations (1) and (6) can be given by:

    \begin{bmatrix} \dot{x} \\ \dot{y} \\ \dot{\theta} \end{bmatrix} =
    \begin{bmatrix} \frac{r}{2}\cos\theta & \frac{r}{2}\cos\theta \\ \frac{r}{2}\sin\theta & \frac{r}{2}\sin\theta \\ \frac{r}{L} & -\frac{r}{L} \end{bmatrix}
    \begin{bmatrix} \omega_R \\ \omega_L \end{bmatrix}    – (7)

This state-space representation holds for both the master and the followers.
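To make the model concrete, the following minimal Java sketch (illustrative only, not part of the robot code; the wheel radius, track width, wheel speeds and step size are placeholder values) integrates the kinematic model (7) with a simple forward-Euler step:

    // Forward-Euler simulation of the differential-drive kinematics in (7).
    public class UnicycleSim {
        public static void main(String[] args) {
            final double r = 0.028, L = 0.12, dt = 0.01;   // assumed wheel radius [m], track width [m], step [s]
            double x = 0.0, y = 0.0, theta = 0.0;          // pose of the robot
            double wR = 6.0, wL = 5.0;                     // wheel angular velocities [rad/s]

            for (int k = 0; k < 1000; k++) {
                double v = 0.5 * r * (wR + wL);            // equation (5)
                x     += dt * v * Math.cos(theta);         // equation (6)
                y     += dt * v * Math.sin(theta);
                theta += dt * r * (wR - wL) / L;           // equation (1)
            }
            System.out.printf("x=%.3f m, y=%.3f m, theta=%.3f rad%n", x, y, theta);
        }
    }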

Stage II: This stage of the follower is designed using the differential equations [2] of the leader-follower distance and bearing control problem.

In our case, if we suppose the camera is mounted directly above the wheel axle, the camera offset terms vanish (assumption), and the equations reduce to the simplified form – (8).

To represent equation (8) in state-space form, it has to be linearised; the resulting linear model – (9) is the one used for the follower's controller design.


Now we can design a PI controller after computing the errors in the relative distance and bearing from the camera measurements – (10). The inputs to the follower are then modified using these errors. The error taken into account is the error in the relative distance, not in the orientation. After simulation, the following results were obtained: the first two figures show the x and y positions versus time for the master, and the next two show the same quantities for the follower under PI control.
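As an illustration of the control law described above, the sketch below shows a minimal discrete-time PI update (illustrative only; the gains, the sample time and the way the correction is applied to the wheel commands are assumptions, not the tuned values used in the project):

    // Minimal discrete PI controller, used conceptually for the distance error.
    public class PiController {
        private final double kp, ki, dt;
        private double integral = 0.0;

        public PiController(double kp, double ki, double dt) {
            this.kp = kp; this.ki = ki; this.dt = dt;
        }

        /** error = reference - measurement; returns the corrective input. */
        public double update(double error) {
            integral += error * dt;                // accumulate the integral term
            return kp * error + ki * integral;     // PI law: u = Kp*e + Ki*integral(e)
        }

        public static void main(String[] args) {
            PiController distancePi = new PiController(1.2, 0.4, 0.05); // assumed gains and sample time
            double correction = distancePi.update(0.30 - 0.42);         // 30 cm desired, 42 cm measured
            System.out.println("speed correction = " + correction);
        }
    }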


3. Line tracking - Leader

One of the controllers incorporated into the leader is the line tracker. It enables the leader to traverse a closed track. The track is printed on a white chart and shades gradually from white to black across its width. The tracking is achieved by implementing a PID controller. As input, it reads the reflected light from a passive light sensor fitted to the front of the robot. The reading is then interpreted as a deviation from the centre of the track. The relation between light intensity and deviation for the chosen track is shown in the figure below.

The line-tracking controller then tries to drive this deviation as close to zero as possible. It reads the error signal in the form of the track deviation and computes the control signal. The control signal is a differential voltage that drives the left and right motors connected to the front wheels. Thus the leader adjusts its position and ensures that it stays at the centre of the track. The leader carries a marker that enables the slaves to reorient their positions relative to the master. The leader is also able to detect curvature and communicate this information to the slaves; however, it only sends the end-of-curve information after it has travelled a certain distance out of the curve.
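A minimal sketch of this loop is given below. It is illustrative only: the sensor and motor calls (readLightPercent, setMotorPower) are hypothetical stand-ins for the leJOS calls actually used, and the mapping from light intensity to deviation as well as the PID gains are assumptions.

    // Sketch of the leader's line-tracking loop: light reading -> deviation -> PID -> differential drive.
    public class LineTracker {
        // Hypothetical hardware accessors standing in for the leJOS sensor/motor API.
        static double readLightPercent() { return 50.0; }               // reflected light, 0..100
        static void setMotorPower(double left, double right) { /* drive the two front motors */ }

        public static void main(String[] args) throws InterruptedException {
            final double target = 50.0;                 // light value at the track centre (assumed)
            final double kp = 1.5, ki = 0.02, kd = 8.0; // assumed PID gains
            final double base = 40.0;                   // base forward power
            double integral = 0.0, previous = 0.0;

            while (true) {
                double deviation = readLightPercent() - target;   // interpreted as distance from centre
                integral += deviation;
                double derivative = deviation - previous;
                previous = deviation;

                double turn = kp * deviation + ki * integral + kd * derivative;
                setMotorPower(base - turn, base + turn);          // differential command to the wheels
                Thread.sleep(20);                                 // control period (assumed)
            }
        }
    }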

4. Vision and formation control – Followers

4.1 Positioning and navigation by machine vision

For indoor positioning of the robots without incorporating any external device, a few types of sensors could be used: a digital compass, an accelerometer, a camera and an ultrasonic distance sensor.


1. Digital compass: the heading of the robot together with the amount of movement can define the position of the robot at each moment in time. The drawbacks of this method are that a digital compass needs calibration every time the environment changes, and that the limited accuracy of the compass is itself a source of error.

2. Accelerometer: by integrating the measured acceleration along each axis, the movement along that axis, and hence the position, can be estimated.

Both of these methods can be categorised as dead reckoning; although they provide a smooth estimate of the position, the estimate drifts over time as small errors accumulate.

3. Ultrasonic distance sensor: it has a limited maximum range and also needs an almost perpendicular surface to reflect the waves back to the sensor rather than to any other object around. Nevertheless, with the goal of getting a more reliable and accurate distance reading by fusing data from two different sources, this sensor was added to the robot. The need was tangible in the early phases of the project, when the camera used to lose the target; in addition, when the two robots are very close the camera is out of focus, so this sensor was used to provide measurements down to 0 cm. Although in practice the robots never get that close, this feature covers the situations where something does not function as expected due to hardware problems.

For the reasons above, the camera was selected as the main sensor for defining the position of the followers with respect to the leader. The projected size of an object depends on its distance, so by knowing the actual size of the object the distance can be calculated. There are also some challenges, which are discussed later.

4.2 Vision sensor

The main source of input information for the followers is an NXTcam sensor. It gives the distance between two robots and the relative bearing. The sensor communicates with the outside world through the I2C protocol.

4.2.1 NXTcam v3.0 sensor

During offline operations, such as programming and configuration, the NXTCam must be connected to the PC (using a USB cable) as well as to the NXT (using a standard NXT connector cable) while the NXT is powered on. During run-time (autonomous) operation on the NXT, the USB connection to the PC must be removed.

While the NXTCam is connected to both the NXT and the PC, the PC communication takes priority over any other communication.

4.2.2 Colormap

The objects of interest are recognized by the NXTCam by matching stored color values against the captured image. To do that, the color values of the objects of interest need to be stored in the NXTCam's memory. These stored color values are known as Colormaps. The NXTCam can store up to 8 Colormaps and provides processed information about the objects matching those Colormaps.


4.2.3 Colormap Structure in NXTCam

The colormap is a 48-byte buffer in the NXTCam's memory. Each of the R, G and B channels is assigned 16 bytes: the first 16 bytes are for red, the next 16 for green and the remaining 16 for blue. The 16 bytes of a channel store the matching preference for that channel's absolute value range in bins of width 16, i.e. the first byte covers values 0 to 15, the second byte 16 to 31, the third byte 32 to 47, and so on up to 240 to 255. Each bit within a byte is the mask for one of the 8 trackable objects and is set to 1 if that value range should match the corresponding object.
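The layout can be summarised by a small helper. The sketch below is illustrative only (the class and method names, and the way the buffer would eventually be written to the camera, are assumptions); it simply sets the mask bit that corresponds to a given channel, intensity range and object number, following the byte and bit layout described above.

    // Builds a 48-byte NXTCam colormap buffer: 16 bytes per channel (R, G, B),
    // one byte per 16-wide intensity bin, one bit per trackable object (0..7).
    public class ColormapBuilder {
        private final byte[] map = new byte[48];
        public static final int RED = 0, GREEN = 1, BLUE = 2;

        /** Mark intensity values [lo, hi] of the given channel as matching the given object. */
        public void addRange(int channel, int lo, int hi, int object) {
            for (int bin = lo / 16; bin <= hi / 16; bin++) {
                int index = channel * 16 + bin;          // offset into the 48-byte buffer
                map[index] |= (byte) (1 << object);      // set the mask bit for this object
            }
        }

        public byte[] toBytes() { return map.clone(); }

        public static void main(String[] args) {
            ColormapBuilder cm = new ColormapBuilder();
            // Example: object 0 matches strong red with little green and blue.
            cm.addRange(RED, 176, 255, 0);
            cm.addRange(GREEN, 0, 63, 0);
            cm.addRange(BLUE, 0, 63, 0);
            System.out.println("red bin 176-191 byte = " + cm.toBytes()[11]);
        }
    }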

4.3 Real-world limitations of camera sensor and solutions

4.3.1 Intensity perception in human eyes vs. camera

Our eyes do not perceive light the way cameras do. With a digital camera, when twice the

number of photons hit the sensor, it receives twice the signal (a "linear" relationship). But

instead, we perceive twice the light as being only a fraction brighter — and increasingly so

for higher light intensities (a "nonlinear" relationship).

Actual perception will depend on viewing conditions, and may be affected by other nearby

tones. For extremely dim scenes, such as under starlight, our eyes begin to see linearly like

cameras do.

Compared to a camera, we are much more sensitive to changes in dark tones than we are to

similar changes in bright tones. There's a biological reason for this peculiarity: it enables our

vision to operate over a broader range of luminance. Otherwise the typical range in brightness

we encounter outdoors would be too overwhelming. Mentioned phenomena can be observed

in Blue ball images:


Figure: the blue ball under uniform, bright lighting (left) and under directional, poor lighting (right).

The problem these significant changes cause is a large variation in color temperature, which in turn leads to variations in the detected object's position and size, and to the detection of false objects.

4.3.2 Effect of fast moving on captured image

As can be seen, jerky movements have a negative effect on the captured image and change the apparent size and color spectrum of the target as well, so it is important to have relatively smooth control.


4.3.3 Dispersion

The NXTcam tends to disperse the light considerably. This leads to false recognition of other objects as objects of interest, because they appear to have the same color temperature.

As can be seen in the next images, the real scene (upper image) contains no red other than the red target itself, but due to dispersion the NXTcam detected five more red objects, which are marked in purple.


4.4 Software techniques to increase the performance of the camera

1. The main problem in any object-recognition application is either the detection of false objects when the sensitivity is low, or fragmentation of the whole object when the sensitivity settings are high. One software trick that has been applied is to sort the detected objects by size and then pick the biggest one. This reduces the chance of grabbing the wrong object for the position calculation (see the sketch after this list).

2. Smoothing the noise with a Kalman filter. Due to quantization error, the boundary of the detected object changes most of the time. To suppress this jumping-around problem, a version of the Kalman filter without the prediction step has been used. Through a single function, both the distance and the bearing-error feedbacks pass through the filter before being fed to the controller. This also makes the system more robust against other types of noise and against missed video frames caused by a sudden blocking of the line of sight, lens flare, a missed packet in communication or other possible reasons.


3. Selecting a bold target, to increase the viewing angle of the followers' cameras. The target object has the shape of half a cylinder and provides a viewing angle of more than 180 degrees.

4. Selecting a narrow and tall pattern as the target, to gain maximum margin at the sides and maximum accuracy for the distance, which is based on the height.
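The sketch below illustrates points 1 and 2: picking the largest detected blob, and passing a derived measurement through a simple no-prediction (fixed-gain) filter before it reaches the controller. It is illustrative only; the Blob structure, the gain value and the method names are assumptions rather than the project's actual code.

    // Pick the largest detected blob and smooth the derived measurement.
    import java.util.List;

    public class TargetSelector {
        /** Hypothetical result of one NXTCam frame. */
        public static class Blob {
            int x, y, width, height;
            Blob(int x, int y, int w, int h) { this.x = x; this.y = y; this.width = w; this.height = h; }
            int area() { return width * height; }
        }

        /** Point 1: sorting by size is equivalent to picking the maximum-area blob. */
        public static Blob biggest(List<Blob> blobs) {
            Blob best = null;
            for (Blob b : blobs) {
                if (best == null || b.area() > best.area()) best = b;
            }
            return best;
        }

        /** Point 2: fixed-gain filter (a Kalman filter with the prediction step removed). */
        public static class FixedGainFilter {
            private final double gain;      // 0..1, how much of each new measurement is trusted
            private double estimate;
            private boolean initialised = false;

            public FixedGainFilter(double gain) { this.gain = gain; }

            public double update(double measurement) {
                if (!initialised) { estimate = measurement; initialised = true; }
                else estimate += gain * (measurement - estimate);   // correction step only
                return estimate;
            }
        }

        public static void main(String[] args) {
            List<Blob> frame = List.of(new Blob(10, 5, 4, 6), new Blob(60, 20, 12, 30));
            Blob target = biggest(frame);
            FixedGainFilter heightFilter = new FixedGainFilter(0.4);
            System.out.println("filtered height = " + heightFilter.update(target.height));
        }
    }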

4.4.1 Make readings of the angle and distance independent

Since the distance reading should be isolated from the reading of the angle, the height of the target was selected as the distance criterion, while the horizontal position of the object on the screen gives the bearing of the target. Because a change in the projected size of the object also changes its distance from the screen edges, which is the reference for the deviation measurement, the deviation measurement must be made independent of the distance. This has been done by first calculating the object's width, subtracting it from the full screen width, and then balancing the object's distances from the two edges of the screen.
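A minimal sketch of this idea follows. It is illustrative only (the screen width and the field names are assumptions): the bearing deviation is derived from the gap between the blob and each screen edge, so it is zero when the target is centred regardless of how wide the target appears.

    // Bearing deviation that is independent of the target's projected width.
    public class BearingFromImage {
        static final int SCREEN_WIDTH = 176;   // assumed image width in pixels

        /**
         * x = left edge of the detected blob, width = its projected width.
         * Returns a signed deviation: 0 when the blob is centred, positive when it
         * sits towards the right edge, negative towards the left edge.
         */
        static int deviation(int x, int width) {
            int leftGap  = x;                              // space between blob and left edge
            int rightGap = SCREEN_WIDTH - (x + width);     // space between blob and right edge
            return leftGap - rightGap;                     // the blob width cancels out of the comparison
        }

        public static void main(String[] args) {
            System.out.println(deviation(68, 40));   // blob of width 40, centred -> 0
            System.out.println(deviation(20, 80));   // wide blob shifted left -> negative
        }
    }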

4.4.2 Correlating size of object to distance

The projected size of an object does not vary linearly with distance. The figure below shows the distance versus projected-size curve based on measurements: the vertical axis is the projected size of the object and the horizontal axis is the measured distance in mm. Since the distance between the camera and the front bumper is about 10 cm, the object cannot get any closer than that.


The range has been divided into 5 regions (70 – 57, 57 – 46, 46 – 37, <37) and the line equation for each region has been extracted. These equations are applied in the distance-calculation function: dist = (float)(dist_a0[i]*ht + dist_a1[i]);
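A sketch of that lookup is shown below. The region boundaries (70, 57, 46, 37) and the expression dist_a0[i]*ht + dist_a1[i] come from the text above; the coefficient values themselves are placeholders, since the fitted values are not reproduced here.

    // Piecewise-linear conversion from projected target height (ht) to distance.
    public class HeightToDistance {
        // Placeholder slopes/offsets per region; the real values come from the fitted line equations.
        static final float[] dist_a0 = { -2.0f, -3.0f, -4.5f, -6.0f };
        static final float[] dist_a1 = { 350f, 420f, 500f, 560f };

        static float distanceMm(int ht) {
            int i;
            if (ht >= 57)      i = 0;   // region 70..57
            else if (ht >= 46) i = 1;   // region 57..46
            else if (ht >= 37) i = 2;   // region 46..37
            else               i = 3;   // region < 37
            return (float) (dist_a0[i] * ht + dist_a1[i]);   // line equation of the selected region
        }

        public static void main(String[] args) {
            System.out.println(distanceMm(60) + " mm");
            System.out.println(distanceMm(30) + " mm");
        }
    }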

Distance correction:

The camera is positioned away from the robot's centre of rotation. The resulting difference is largest when the object is directly in front of the camera and gradually decreases as the object moves to the sides. This error has been compensated to avoid introducing any further error.

Angle correction:

The centre of rotation of the robot is the middle point of the front axle and the steering control is based on this point, but the camera is located further towards the back. This introduces an error when the camera looks at an object far from the straight-ahead viewing angle. Consider the image below:

By applying the triangle relations, the angle 'a' is calculated and subtracted from the reading angle.
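Since the original figure is not reproduced here, the sketch below only illustrates the kind of correction involved, under the assumption that the camera sits a fixed distance behind the axle midpoint; the offset value and method names are assumptions, not the project's actual geometry.

    // Re-express a camera measurement (distance, angle) relative to the axle midpoint,
    // assuming the camera is mounted CAMERA_OFFSET metres behind that point.
    public class AngleCorrection {
        static final double CAMERA_OFFSET = 0.06;   // assumed camera-to-axle distance [m]

        /** Returns the target bearing as seen from the axle midpoint instead of the camera. */
        static double correctedBearing(double distance, double cameraAngle) {
            // Target position in the camera frame (x forward, y to the left).
            double tx = distance * Math.cos(cameraAngle);
            double ty = distance * Math.sin(cameraAngle);
            // Shift the origin forward to the axle midpoint and recompute the bearing.
            return Math.atan2(ty, tx - CAMERA_OFFSET);
        }

        public static void main(String[] args) {
            double measured  = Math.toRadians(25.0);
            double corrected = correctedBearing(0.40, measured);
            // The difference plays the role of the angle 'a' applied to the camera reading.
            System.out.printf("a = %.2f deg%n", Math.toDegrees(corrected - measured));
        }
    }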


4.4.3 Tracking the object

When both the target and the observer are moving, the stability margin of the system decreases considerably. It becomes even narrower when the object is close, because the target then reaches the edges of the screen more easily. This problem was the main reason for incorporating a 1-DOF (panning) camera, which keeps following the target independently of the orientation of the robot chassis. Since the dynamics of the camera panning mechanism are much faster than those of the rest of the system, it has become almost impossible for the camera to lose the target because of the constraints mentioned above. After the target is centred on the camera screen, the angle of the camera motor is fed to the main controller as the angle error. This happens concurrently, acts like a buffer for the main controller, and lets the main controller avoid jerky moves by giving it more time to react. In the model with a fixed camera, the sudden moves caused by the impulsive reaction of the controller affected the functionality of the whole system, since they were reflected back into the system through the feedback, in this case the vision.

It is easily possible to build a 2-DOF camera by adding the same function for the second axis, in order to take care of height as well in the case of flying robots.

4.5 Main controller

The main controller consists of two coupled PID controllers that correct the deviations in both angle and distance. The two error signals are translated into one control signal, since the low-level commands to the wheel motors are supposed to regulate both errors at the same time, yet independently. In other words, if the distance is at its set value but the leader is not at the centre of the follower's sight, the follower can adjust this without changing the distance, and vice versa.
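The way the two corrections can be combined into one differential wheel command is sketched below. It is illustrative only: the controller functions, gains and base speed are assumptions, and the actual coupling used in the project may differ.

    // Combine the distance and angle corrections into left/right wheel commands.
    public class FollowerMixer {
        /** Hypothetical controller outputs for one control cycle. */
        static double distanceCorrection(double distanceError) { return 1.8 * distanceError; }
        static double angleCorrection(double angleError)       { return 0.9 * angleError; }

        static double[] wheelCommands(double distanceError, double angleError) {
            double base  = 200.0;                                         // nominal speed (assumed units)
            double speed = base + distanceCorrection(distanceError);      // both wheels change together
            double turn  = angleCorrection(angleError);                   // wheels change in opposite directions
            return new double[] { speed - turn, speed + turn };           // {left, right}
        }

        public static void main(String[] args) {
            // Distance already correct, leader off-centre: the robot turns without changing speed.
            double[] cmd = wheelCommands(0.0, 0.2);
            System.out.println("left=" + cmd[0] + " right=" + cmd[1]);
        }
    }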

The follower robots wait for the curve-detection signal issued by the leader. On detecting this signal, the slaves' cameras track the leader for a preset distance. Then they lock to the last read camera angle and use it to derive a new camera reference angle. This continues until the end-of-curve signal is issued by the leader.

4.6 Possible improvements for vision

1. Using a bright target with a narrow color temperature rather than reflective objects in order to isolate the received light from ambient light.

2. Incorporating a more sophisticated camera with light-balance, white-balance and zooming capabilities.

Source: http://www.cambridgeincolour.com

5. Communication

As mentioned earlier, the robots communicate via the Bluetooth RF standard. Once the devices are paired, connections are initiated between the robots on a point-to-point basis. Inbound and outbound connections are represented by DataInputStream and DataOutputStream objects respectively. These streams provide a simple way for the robots to communicate, but they provide little in the way of addressing or data-integrity checking.


In an ideal world, every message sent from one robot to another would arrive at the correct

destination with all its data intact. Unfortunately, this is not the case, and is a particular

problem when working with any form of wireless interface. Although it is unlikely that a

message could be delivered to the wrong recipient as we are working with point to point

streams, it is still possible that a programmer error or hardware bug could cause such a

problem. Additionally, since more than one variable is sent between the robots, the variables

need to have names associated with them, in order that the message recipient knows what to

do with the value.

Therefore, a protocol that the robots could use to communicate with each other was proposed

and implemented: the Inter-Robot Communication Protocol (IRCP). This provides both

address checking and data integrity, and supports the sending of multiple variables in one

message, in order to reduce transmission overheads.

5.1 Packet format

The IRCP supports the sending of multiple data values in one packet to reduce overall latency. The following example shows the structure of an 11-field message used to carry two variables (the variable names are contained in fields 5 and 8, and the values in fields 7 and 10). A single-variable message would have the same structure, but would lack fields 8, 9 and 10. Further variables can be added indefinitely, each variable requiring three fields (name, type and value). All fields are represented using single bytes, with the exception of the data values themselves, which can vary in size from one to four bytes depending on the data type.

Field 1: Destination ID

Field 2: Source ID

Field 3: Priority

Field 4: Number of data fields

Field 5: Data field name i

Field 6: Data field type i

Field 7: Data field value i

Field 8: Data field name i+1

Field 9: Data field type i+1

Field 10: Data field value i+1

Field 11: Message checksum

The destination ID is the robot ID of the intended recipient. These IDs are unique and range from 1 to 3 in our system (but this could be extended to any number in the future). Whilst this check will not always be needed, there is a small chance that either a programmer error or a problem in the Bluetooth system could cause a message to be delivered to the wrong node. Therefore this field is included, so that it may be checked by software. This avoids the possibility of a message being processed by a robot that it was not intended for, which could have a negative effect on the system, as all three robots are trying to remain in formation.

The source ID is used to identify the sender of the message, and is important as the master

robot will be receiving messages from two slaves, and will need to be able to determine

which robot sent the message. This is important in any system with more than two devices.

The priority field ranges from 0 to 9, with 9 being the highest priority. This means that, at the master, an important packet from slave 'a' will take priority over a less important message

from slave 'b'. This was originally included to reduce the time that important messages are

kept waiting behind less important ones. However, as the project progressed, it became clear

that a) There were not so many messages to be transmitted as first anticipated, and b) The

delay associated with sending a message was fairly small, so the performance gain resulting

from using a priority-based system is negligible. This field has, however, been preserved, so

that if the project is to be developed in the future, it may be utilised.

Field 4 simply indicates how many variables are contained within the packet. This is used by

the receiver during message processing.

Fields 5 to N-1 are the name-type-value triples for each variable. A variable name may only be one character long, but this still allows 62 distinct alphanumeric names, plus a few symbols. This is more than ample for our needs, and provides plenty of scope for the future as well.

The variable type is an enumerated data type that may be ‘c’, ‘i’, ‘f’ or ‘s’, representing the

types char, int, float and string respectively. More data types could be added in the future if

required.

The variable value is converted to a string and then transmitted in text format. This is because

the whole message is sent using the DataOutputStream method writeUTF, which can handle

only strings. Conversion from the int and float types to string is a simple process, and char

and string values are simply transmitted with no conversion required.

The last field is the checksum of all the previous bytes. This is a check to ensure that the received data is not corrupted, which is very important when dealing with angle and distance measurements. The checksum is computed using the sequential XOR method. To do this, the message string is first converted to an array of bytes. The first byte is XORed with the XOR identity (0b00000000), the result is then XORed with the next byte, and the process is repeated until the last byte is reached. The resulting XOR byte is then appended to the array, which is subsequently converted back to a string and is then ready for transmission.
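A sketch of this sequential-XOR step is shown below. It is illustrative only; the method name and the use of an ASCII charset are assumptions, and the real code works on the byte representation used by the leJOS streams.

    // Append a sequential-XOR checksum byte to a message before transmission.
    import java.nio.charset.StandardCharsets;

    public class Checksum {
        static byte[] withChecksum(String message) {
            byte[] data = message.getBytes(StandardCharsets.US_ASCII);
            byte check = 0b00000000;                 // XOR identity
            for (byte b : data) {
                check ^= b;                          // fold each byte into the running checksum
            }
            byte[] out = new byte[data.length + 1];
            System.arraycopy(data, 0, out, 0, data.length);
            out[data.length] = check;                // checksum appended as the last byte
            return out;
        }

        public static void main(String[] args) {
            byte[] framed = withChecksum("example");
            System.out.println("checksum byte = " + framed[framed.length - 1]);
        }
    }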


5.2 Example transmission

The following message is an example transmission from a slave (ID 3) to the master (ID 1),

of three variables (named 'X', 'Y' and 'Z') with values 5, 10 and 100 respectively, with a

priority of 9 (highest). The fields are shown in ASCII format to make the message easier to read, and the | symbol is used as a visual aid to separate the fields. The checksum shown is just an example value.

1|3|9|3|X|5|Y|10|Z|100|67

The same message without the | symbols:

1393X5Y10Z10067

5.3 Transmission of messages

As detailed earlier, our system uses threads to ensure smooth operation, free from

interference from delays caused by transmitting and receiving Bluetooth messages. This

however, presents a problem when another thread wishes to send data using the Bluetooth

thread. This problem is solved by using an inbox/outbox structure, whereby each robot has

two global String arrays that operate in the same manner as an email client’s inbox and

outbox.

When a thread wishes to send a variable, it calls one of the functions bufferChar, bufferInt,

bufferFloat or bufferString. These functions take the variable’s name and value as their

inputs, and append the variable’s name and value to a temporary string. When the thread has

no more variables to buffer and wishes to send the message, it calls the function

writeMessage, which takes the message’s destination address and priority as inputs. This

function then copies the contents of the temporary string into the robot’s outbox, ready to be

sent. After this, the temporary string is emptied and a running counter of the number of

messages in the outbox is incremented.
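A short usage sketch of this buffering scheme is given below. The function names bufferInt, bufferFloat and writeMessage are those described above; their exact signatures, and the Comms class wrapping them, are assumptions made for illustration (the variable count, source ID and checksum that the real sendThread adds are omitted here).

    // Hypothetical illustration of buffering two variables and flushing them as one IRCP message.
    public class OutboxExample {
        /** Stand-in for the robot's communication module described in the text. */
        static class Comms {
            private final StringBuilder pending = new StringBuilder();

            void bufferInt(char name, int value)     { pending.append(name).append('i').append(value); }
            void bufferFloat(char name, float value) { pending.append(name).append('f').append(value); }

            /** Copies the buffered variables into the outbox as one message (destination, priority). */
            void writeMessage(int destination, int priority) {
                String message = destination + "" + priority + pending;
                System.out.println("queued in outbox: " + message);   // sendThread would pick this up
                pending.setLength(0);                                  // clear the temporary string
            }
        }

        public static void main(String[] args) {
            Comms comms = new Comms();
            comms.bufferInt('X', 5);          // first variable
            comms.bufferFloat('D', 0.42f);    // second variable
            comms.writeMessage(1, 9);         // send both to robot 1 with priority 9
        }
    }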

The Bluetooth transmission thread, sendThread, which is running periodically, checks this

outbox every time it iterates. If any pending messages are found in the outbox, they are then

processed and sent, and the outbox counter decremented accordingly. A message is sent by

calling the method writeUTF on the output stream. The thread sendThread is responsible for

adding the source address to the message and calculating the XOR checksum.

5.4 Reception of messages

This works in much the same way as message transmission. The thread receiveThread is

running periodically, and checks the input stream every iteration using the method readUTF.

If there is nothing to read from the stream, then this method returns nothing and the thread

sleeps until the next iteration. When a message is detected on the input stream, it is processed

immediately. This means checking the XOR checksum byte at the end of the message is

correct, before then checking that the source address is correct. If either of these tests fail, the

message is simply dropped. The next job for receiveThread is to extract the variables from

the message.


This is done by checking field 4, the number of variables in the message. For each variable,

its name, type and value are extracted, as one contiguous string, and simply placed in the

robot’s message inbox. A counter for the inbox is then incremented.

Checking of the inbox is done on the main thread, which is also looping indefinitely. For each

message in the inbox, the name and type fields are examined, then the value field is converted

from a string to the appropriate data type. So, for example, if a variable of type ‘i’ (int) is

contained within the message, then its value is converted to type int, and stored in a

temporary variable, incomingInt. There are three other corresponding temporary variables,

for handling the three other possible data types.

The final job to be done is to decide what to do with the variable. This decision is made on

the basis of the variable name, and is represented in our system using a switch statement.

Each case in the switch represents a possible variable name. Inside each case statement is the

code needed to handle that particular variable. For example, when a variable named a is

received, then the switch jumps to the ‘a’ case, which contains code for processing the value

of ‘a’. Note that this decision switch is run on the main thread. If a variable is destined for

another thread, the main thread can simply update a static variable in that thread, rather than passing a message or triggering an event in that thread.
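A reduced sketch of this dispatch is shown below; the variable names 'a' and 'd', the incomingInt and incomingFloat temporaries, and the static field being updated are illustrative assumptions following the scheme described above.

    // Dispatch a received (name, type, value) triple to the right handler on the main thread.
    public class InboxDispatch {
        static int incomingInt;           // temporary holding the converted int value
        static float incomingFloat;       // temporary for float values

        /** Hypothetical static setpoint living in another thread's class. */
        static class VisionThread { static volatile int curveState; }

        static void handle(char name, char type, String value) {
            if (type == 'i') incomingInt = Integer.parseInt(value);
            if (type == 'f') incomingFloat = Float.parseFloat(value);

            switch (name) {
                case 'a':                                    // e.g. curve start/end flag from the leader
                    VisionThread.curveState = incomingInt;   // update a static variable in that thread
                    break;
                case 'd':                                    // e.g. a distance setpoint
                    System.out.println("new distance setpoint: " + incomingFloat);
                    break;
                default:
                    break;                                   // unknown variable name: ignore the value
            }
        }

        public static void main(String[] args) {
            handle('a', 'i', "1");
            handle('d', 'f', "0.35");
        }
    }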

This system is simple and works well, but it does require all the robots to use a common set

of variable names, to ensure that the data is handled correctly.

5.5 Performance

Whilst the IRCP adds an overhead to the system, the transmission of messages is still a

relatively fast affair. To measure the latency of the system as a whole, a series of ‘ping’ tests

were carried out. These measured the time taken for a message to travel from the master to

the slave and back. With the two Bluetooth threads, sendThread and receiveThread, running

at a frequency of 20Hz, the average time for this round-trip was around 550ms, with the

worst case being around 2000ms. For our application, these timings are acceptable, especially

when one considers that the data to be sent needs only to travel one way, not undertake an entire round trip.

Whilst some performance is undoubtedly lost due to the use of the IRCP, the features it

offers, particularly data integrity checking and variable names, comfortably outweigh this

drawback.

6. Conclusion

The project was carried out successfully. The control algorithms performed as expected. The formation was rigid to an extent and showed resilience to momentary communication failures and vision disturbances.