
Research Report Movement - Fontys@WORK

Client : Fontys@WORK

Date : 22-09-2017

Version : 1.0

Authors : M. Diep, H. Heijnen, L. van Rossum, J. de Weger, L. Jaeqx, W. Sneijers


Version

Version | Date       | Author(s)                                                              | Changes                               | Status
0.1     | 23-11-2017 | M. Diep, H. Heijnen, L. van Rossum, J. de Weger, L. Jaeqx, W. Sneijers | Initial version                       | Draft
0.2     | 24-11-2017 | M. Diep, H. Heijnen, L. van Rossum, J. de Weger, L. Jaeqx, W. Sneijers | Added research question 1             | Draft
0.3     | 10-01-2018 | M. Diep, H. Heijnen, W. Sneijers                                       | Added research questions 2 and 3      | Draft
1.0     | 12-01-2018 | M. Diep, H. Heijnen, W. Sneijers                                       | Added conclusion and recommendations  | Final


Table of contents

1. Problem statement and project objectives
  1.1 Background
  1.2 Problem statement
  1.3 Goal of the project
  1.4 Scope of the project
2. Research focus and research (sub-)questions
  2.2 How does robot movement work
R1 How does the movement stack provided in the Youbot sources work and how does it perform?
  R1.1 Which sensors and actuators does the Youbot provide?
  R1.2 What components make up the source’s movement stack?
    R1.2.1 Creating a map
    R1.2.2 Navigating in a map
    R1.2.3 Simultaneous Localization and Mapping (SLAM)
  R1.3 What components does ROS recommend to use for navigation?
R2 How can we create a map of the arena?
  R2.1 Are the provided sources capable of creating a map and if so how?
  R2.2 What methods are available in ROS for creating a map?
R3 How can we navigate a map to reach certain points?
  R3.1 Are the provided sources capable of navigating a map and if so what is the accuracy?
  R3.2 How does the movement stack of ROS work?
    R3.2.1 Move-base
    R3.2.2 Local planner
    R3.2.3 Global planner
  R3.3 How can we specify a goal to the robot?
  R3.4 What is the accuracy of our implementation?
Conclusion
  R1 How does the movement stack provided in the Youbot sources work and how does it perform?
  R2 How can we create a map of the arena?
  R3 How can we navigate a map to reach certain points?
Recommendations
References


1. Problem statement and project objectives

1.1 Background

The Fontys@WORK team owns a robot that was bought specifically for the RoboCup@WORK competition. The software the team created for this robot is neither complete nor very user friendly: it was written by one of the students working on the robot simply to get it running, and it lacks a clear architecture and an easily usable user interface. In addition, another group is simultaneously designing and building a new robot for this competition, which means the new software architecture has to be portable to that platform as well.

1.2 Problem statement

There is no clearly documented and correctly implemented software architecture for the Fontys@WORK team to use in the RoboCup@WORK competition. To be able to compete in the RoboCup@WORK competition we need to lay the foundation for a well-documented and modular code base.

1.3 Goal of the project

The goal of the project is to provide a software architecture which the Fontys@WORK team can use for the RoboCup@WORK competition. It is important that this architecture is portable between the current Youbot and the new robot platform. The architecture should also abstract complex robot behavior, like movement, path planning and arm kinematics, into an easy-to-use API. Another important aspect is that the architecture is clearly documented, because it will be used by other teams in the future.

1.4 Scope of the project

The main limitation is that we do not deliver a complete solution to win the RoboCup@WORK competition. Instead we provide a base that others can use to more easily operate and debug the platform in the run-up to and during the competition, and to expand on in the future.


2. Research focus and research (sub-)questions

2.2 How does robot movement work

In order to compete in the competition the robot needs to move to certain points (work areas) in the arena to perform tasks. Summarized from the RoboCup@Work rulebook (RoboCup@Work Rulebook, 2017), the robot needs to know these points. A referee system will tell the robot to move to a specific area to do a specific task (for example, pick up an object). This means the robot needs to know a couple of things in advance:

● The layout of the arena (i.e. a map).
● Its position in this map.
● The positions of the work areas.

The RoboCup@Work guidelines also state that each team has one hour before their turn to map the arena and set up the points.

Besides the points above, there are also a lot of concepts within ROS which need to add up in order for the robot to successfully move in the arena. For example, how does the movement stack of ROS work? There is also an existing codebase from the competition for the Youbot, which could give a good starting point. Combining these things gives the following research questions:

R1. How does the movement stack provided in the Youbot sources work and how does it perform?

1. Which sensors and actuators does the Youbot provide?
2. What components make up the source’s movement stack?
3. What components does ROS recommend to use for navigation?

R2. How can we create a map of the arena?

1. Are the provided sources capable of creating a map and if so how?
2. What methods are available in ROS for creating a map?
3. How can we use a created map during the competition?

R3. How can we navigate within a map to reach certain points?

1. Are the provided sources capable of navigating a map and if so what is the accuracy?
2. How does the movement stack of ROS work?
3. How can we specify a goal to the robot?
4. What is the accuracy of our implementation?

Since these questions rely heavily on existing sources and documentation we want to use a practical approach: each question will first be answered through literature research, and this research will then be validated on the Youbot with example code.


Expected results:

1. We expect that we can create a map in advance and move the robot within this map during the competition using predefined goals.
2. A report comparing the movement accuracy of the provided sources and our own implementation.
3. An advice for the next iteration containing potential improvements to the accuracy and repeatability of the robot’s movements.

Planning:

The research is done in parallel with the tasks in the project, and we update the research documents based on the tasks we have completed for the project.


R1 How does the movement stack provided in the Youbot sources work and how does it perform?

In order to answer this question we started by figuring out what sensor sources and actuator outputs the Youbot provides (R1.1): in short, a LiDAR and wheel encoders.

Next we looked at how the movement stack of the provided sources works (R1.2). It uses the move_base and AMCL packages from ROS with a few additions.

To determine whether or not we should continue using these packages, we looked at whether there were already projects using this technology and what the opinion of the ROS community was. After doing some research and having a discussion on whether or not to build our own navigation stack from scratch, we decided to keep these packages. Several industrial projects have proven that these packages work, and the general consensus in the ROS community is that they are a good starting point.

We now have a basic understanding of the hardware and software that make up the provided movement stack of the Youbot. Next we want to conclude this question by testing the navigation stack: sending goals and observing what the robot does in a real-world test environment. Parameters that we will pay attention to are spatial drift and rotational drift.

Testing setup:

1. Build an area and mark the position of the starting point.
2. Map the area and save the map.
3. Drive around with the joystick teleop to another point on the map.
4. Save the position and rotation coordinates.
5. Mark the position of the wheels with tape.
6. Drive to some other position.
7. Let the robot navigate back to the previous point.

We repeat the last two steps three times, to see if the errors are incidental or cumulative.


Figure 1 Robot in testing area.

Figure 2 Rviz map


R1.1 Which sensors and actuators does the Youbot provide?

For the input and output of the movement stack we need to know what the capabilities of the robot are, so we can deal with any limitations and decide how relevant the sources are.

Inputs

The provided Youbot comes with the following sensors related to movement:

● 1x front-mounted Hokuyo UBG-04LX-F01 laser scanner
● Odometry (rotation) from the wheel encoders

An additional identical laser module is available, but it has not been hooked up yet.

Field of view: 240°
Angular resolution: 0.36°
Minimum range: 20 mm
Maximum range: 5600 mm
Range resolution: 1 mm
Update rate: 28 ms/scan

The specifications of the Hokuyo laser scanner (Hokuyo UBG-04LX-F01, 2017).

This should provide rather accurate results, as this laser module is one of the higher-end ones.
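
To illustrate how this sensor is consumed on the software side, the following minimal sketch subscribes to the laser data in ROS. It assumes the Hokuyo driver publishes sensor_msgs/LaserScan messages on a /scan topic; the actual topic name depends on the launch files.

    #!/usr/bin/env python
    # Minimal sketch: reading the Hokuyo data in ROS (assumed topic name: /scan).
    import rospy
    from sensor_msgs.msg import LaserScan

    def on_scan(scan):
        # Keep only the beams that fall inside the sensor's valid range.
        valid = [r for r in scan.ranges if scan.range_min < r < scan.range_max]
        if valid:
            rospy.loginfo("%d/%d beams valid, closest obstacle at %.2f m",
                          len(valid), len(scan.ranges), min(valid))

    if __name__ == '__main__':
        rospy.init_node('scan_monitor')
        rospy.Subscriber('/scan', LaserScan, on_scan)
        rospy.spin()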


Outputs

The Youbot has the following actuators:

● 4x mecanum wheels

Figure 3 Mecanum wheels on the Youbot

Mecanum wheels (figure 3) are designed to be able to move in any direction. This allows the robot to move freely on the x and y axes. The contact points of the mecanum wheels are made from a hard material, which may cause the wheels to slip. This is a consideration for how much we value data from this source, as it might be unreliable.


R1.2 What components make up the source’s movement stack?

In order to improve the implementation we first need to examine how it works. Since it is built on ROS, it consists of launch files which spawn nodes. Also, using ROS tools, we can make a decomposition of these nodes and their relations. Note that there are two different launch files: one for creating a map and one for navigating in a map.

R1.2.1 Creating a map:

Figure 4 component diagram for launch structure while creating map

We can break this diagram down into several components. Each component has a different task / responsibility.

Lasers (hokuyo_front):

This component is tasked with initializing the front Hokuyo LiDAR and publishing its data to ROS.

Youbot_driver:

This component initializes the robot base, which consists of the mecanum wheel drivers and encoders. It also ‘uploads’ the description of the Youbot (dimensions, 3D models and components for simulation) to ROS. It then provides topics to control the wheels and read the odometry.

2dslam:

This component creates a map from the LiDAR data in ROS. This data can be saved to a map file. The approach uses gmapping (Simultaneous Localisation and Mapping (SLAM), 2017) to interpret the LiDAR data. Move_base is used to move the robot through the arena with a joystick while making the map.


R1.2.2 Navigating in a map:

Figure 5 component diagram for launch structure while navigating

We can break this diagram down into several components. Each component has a different task / responsibility. The Lasers (hokuyo_front) and Youbot_driver components are identical to the ones described in R1.2.1.

Move_base_global:

This component starts the ROS navigation stack. It uses the ROS move_base (move_base, 2017) component for the navigation. It uses AMCL (AMCL, 2017) to localize the robot in the map using the LiDAR and the odometry of the Youbot. Besides that, it launches a filter for the output to the wheels.
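
As an illustration of the localization output, the minimal sketch below reads the pose estimate that AMCL publishes. It assumes the default /amcl_pose topic (geometry_msgs/PoseWithCovarianceStamped); remappings in the launch files may change this name.

    #!/usr/bin/env python
    # Minimal sketch: reading the AMCL pose estimate (assumed topic: /amcl_pose).
    import rospy
    from geometry_msgs.msg import PoseWithCovarianceStamped

    def on_pose(msg):
        p = msg.pose.pose.position
        rospy.loginfo("AMCL estimate in frame '%s': x=%.2f y=%.2f",
                      msg.header.frame_id, p.x, p.y)

    if __name__ == '__main__':
        rospy.init_node('amcl_pose_monitor')
        rospy.Subscriber('/amcl_pose', PoseWithCovarianceStamped, on_pose)
        rospy.spin()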

R1.2.3 Simultaneous Localization and Mapping (SLAM)

One of the packages used in the navigation stack implements an approach called SLAM. We initially did not know what it was or how it worked. After doing some research we found a paper called Simultaneous Localisation and Mapping (SLAM): Part I The Essential Algorithms. In short, SLAM builds a map of an unknown environment while simultaneously estimating the robot’s position within that map.


R1.3 What components does ROS recommend to use for navigation?

For navigation, ROS provides an implementation called the ROS navigation stack (navigation, 2017). This is meant to be a general-purpose implementation for navigation of both differential-drive and holonomic wheeled robots.

We had a short discussion on whether to use this package or to create our own navigation stack. But seeing as this is a standard within ROS with proven industrial implementations (RobotsUsingNavStack, 2017), and the provided KUKA implementation also uses it, we decided against building our own, as that would be a significant time investment.

The navigation stack has a number of requirements that need to be fulfilled for it to work:

● It assumes that the mobile base is controlled by sending desired velocity commands in the form of an x velocity, a y velocity and a theta velocity (a minimal sketch of this interface is given after the list).
● A tf tree that has proper transforms.
● Sensor streams that provide odometry data.
● A planar laser mounted somewhere on the mobile base. This laser is used for map building and localization.
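
The sketch below illustrates the first requirement: the base is driven by velocity commands with an x, y and theta component. It assumes the conventional cmd_vel topic and geometry_msgs/Twist messages; the Youbot driver may remap this topic, and the velocity values are placeholders.

    #!/usr/bin/env python
    # Minimal sketch of the velocity interface the navigation stack expects.
    # Assumed topic: cmd_vel (geometry_msgs/Twist); values are placeholders.
    import rospy
    from geometry_msgs.msg import Twist

    if __name__ == '__main__':
        rospy.init_node('cmd_vel_example')
        pub = rospy.Publisher('cmd_vel', Twist, queue_size=1)
        cmd = Twist()
        cmd.linear.x = 0.1   # forward velocity [m/s]
        cmd.linear.y = 0.05  # sideways velocity [m/s], possible with mecanum wheels
        cmd.angular.z = 0.0  # theta (rotational) velocity [rad/s]
        rate = rospy.Rate(10)
        start = rospy.Time.now()
        while not rospy.is_shutdown() and rospy.Time.now() - start < rospy.Duration(2.0):
            pub.publish(cmd)
            rate.sleep()
        pub.publish(Twist())  # zero velocity: stop the base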


R2 How can we create a map of the arena?

This question is important because we need to create a map before the competition run starts. To answer this question we first looked at the available sources, specifically whether they contained an example for creating a map. The examples used gmapping SLAM as their approach.

The next step was to look at the available methods for creating a map. The requirement that the map needs to be made in advance limits the options. After testing, gmapping SLAM was chosen because the resulting map was of better quality. We did observe that it is important not to rotate in place while creating the map, as this results in warped walls.

R2.1 Are the provided sources capable of creating a map and if so how?

As mentioned in chapter 2 there are existing sources from the previous team and RoboCup@Work available. In question R1.2 we already dissected the navigation stack from the available sources. It revealed that there is a premade way of creating a map using a SLAM (Simultaneous Localization and Mapping) variant called gmapping. In short, as described on the gmapping (gmapping, 2015) information page, this algorithm creates a grid map of the arena while the robot is driven manually by an operator. After creating the map, it needs to be saved to a file. This file can then be used at a later stage with the ROS navigation stack.
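
As a small illustration of this workflow, the sketch below reads the occupancy grid that gmapping publishes while the robot is driven around. It assumes the default /map topic; saving the grid to disk is normally done with the map_server's map_saver tool, which listens to the same topic.

    #!/usr/bin/env python
    # Minimal sketch: inspecting the map built by gmapping (assumed topic: /map).
    import rospy
    from nav_msgs.msg import OccupancyGrid

    if __name__ == '__main__':
        rospy.init_node('map_inspector')
        grid = rospy.wait_for_message('/map', OccupancyGrid)
        rospy.loginfo("map is %d x %d cells at %.3f m/cell",
                      grid.info.width, grid.info.height, grid.info.resolution)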

R2.2 What methods are available in ROS for creating a map?

When using ROS there are many ways to create a map (from many different input devices). However, a requirement (as mentioned in chapter 2) is that the map needs to be created in advance. This limits the options severely. In fact, using ROS, there are two viable options:

• gmapping SLAM
• Hector SLAM

Notice that both are implementations of the SLAM approach (which means mapping while driving). There are more variants within ROS using the SLAM approach (rtabmap_ros, 2017; karto_slam, 2017). However, these are not actively developed and lack good documentation and a supporting community. There is (apart from the internal algorithm) one major difference between gmapping and Hector SLAM: gmapping SLAM uses both the LiDAR sensor(s) and the odometry (wheel encoders) for position estimation and creating the map, while Hector SLAM uses only the LiDAR sensor(s).

To select which variant to use, some testing is needed. We chose to test in a real environment with the Youbot. We also thought about using a simulator, but because a simulator does not always reflect the real world we decided against that. Also, the robot needs to be driven manually, so there is no hazard that it drives into a wall (or person) by itself.

The shape of the test arena chosen was mostly square with only a few unique features, because the RoboCup@Work arena is also square, though it has more unique features. Also, after a discussion with people from the engineering department of Fontys Eindhoven,


we learned that using similar walls all around makes it more difficult for the LiDAR sensors to distinguish them. This in effect makes the test arena more difficult than what is eventually needed.

We started with Hector SLAM. This, however, did not give the desired result:

We noticed that, while driving around, the map started rotating more and more. This is because the walls are so similar that the robot loses its position in space. One observation made is that the rotation is affected more than the linear displacement.

Next, gmapping SLAM was tested. The results were better, but still not as desired.

One observation is that the straight corners were warped when driving the robot around. This was puzzling at first and cost a lot of time to figure out. In the end it was because we mapped by driving a bit and then rotating the robot in place. This in-place rotation causes the offset. The solution is to drive in as many straight lines as possible.

From the observations during testing we selected gmapping for the map creation, because the map is of better quality. It is also more robust than Hector SLAM when used in an arena with few unique features. Using this approach means we save the map to a file. This file can be used by the navigation stack at a later time.


R3 How can we navigate a map to reach certain points?

R3.1 Are the provided sources capable of navigating a map and if so what is the accuracy?

As a starting point we looked at the available sources from RoboCup@Work. They provide a move_base implementation which uses the Youbot odometry and a front LiDAR sensor to navigate a known map. However, we were not able to get accurate results using these sources. After driving a short distance the robot started rotating in place and did not reach its goal. If the goal was closer than the distance at which the Youbot started rotating, it did reach the goal; however, after reaching the goal the robot overshot and rotated in place very slowly.

R3.2 How does the movement stack of ROS work?

The movement stack, also called the navigation stack, is a software stack that takes in odometry and sensor streams and uses this information to output velocity commands to the robot. For the navigation stack to work there are several prerequisites: the robot needs to be running ROS, have a proper tf transform tree and be publishing data on the correct topics using the right message types.
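
As an illustration of the tf prerequisite, the sketch below checks that a transform from the map frame to the robot base is available. The frame names map and base_link follow the usual ROS conventions; the actual names depend on the robot description and launch files.

    #!/usr/bin/env python
    # Minimal sketch: verifying the tf tree the navigation stack relies on.
    # Assumed frame names: 'map' and 'base_link' (ROS conventions).
    import rospy
    import tf

    if __name__ == '__main__':
        rospy.init_node('tf_check')
        listener = tf.TransformListener()
        listener.waitForTransform('map', 'base_link', rospy.Time(0), rospy.Duration(5.0))
        (trans, rot) = listener.lookupTransform('map', 'base_link', rospy.Time(0))
        rospy.loginfo("base_link is at x=%.2f y=%.2f in the map frame", trans[0], trans[1])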

R3.2.1 Move-base

The ROS navigation stack uses the move_base package (move_base, 2017) as a central point to combine several components.

The move_base node combines the data from the localization, the sensor data and the map data and feeds this information to a package that builds a global and a local costmap (costmap_2d, 2017). When it receives a goal to drive to, it sends this to a global and a local planner that calculate a path and publish velocity commands for the robot to execute.

R3.2.2 Local planner

There are several possible solutions and techniques we can use for the local planner. We researched and tested several: base_local_planner, dwa_local_planner,


eband_local_planner and teb_local_planner (base_local_planner, 2017; dwa_local_planner, 2018; eband_local_planner, 2017; teb_local_planner, 2016).

The project used the base_local_planner when we received it, but after researching other people’s experiences and testing it for ourselves we were not satisfied with its performance. That is why, after looking at other available solutions, we found three possible replacements.

When researching these by looking at their documentation, already working projects and other people’s experiences, we quickly realized that the teb_local_planner would not work for us, because the current version only works with non-holonomic robots and we use a robot base with omni wheels (holonomic).

We tried to implement the eband_local_planner, since this planner tries to minimise computational requirements, which is important considering that we run all of our software on a laptop connected to the wheel platform. However, the current implementation is mostly tested for non-holonomic drives; while it has support for holonomic drives, this functionality was developed very early on, has not been tested since and is not being actively developed. This is why we decided against using this planner: this project needs to stay up to date, and we would like to be able to periodically update all the software so that we can take advantage of new developments, which is not possible when software is not actively being developed.

In the end we’ve chosen to use the dwa_local_planner, there are several reasons for this. It is fairly well supported and kept up-to-date, is has support for holonomic type drives and in comparison to the trajectory_planner that was originally being used it allows for more freedom of movement making it possible to take full advantage of the omni wheels by driving backwards and sideways. From what we’ve read on the accuracy of this planner it exceeds the trajectory_planner (ROS Navigation Tuning Guide, 2016) and our own testing confirms this which is explained further in chapter 3.4.

R3.2.3 Global planner

There are, as with the local planner, several options for the global planner (BaseGlobalPlanner, 2017; CarrotPlanner, 2018). The examples from RoboCup@Work use the BaseGlobalPlanner. We decided to test this planner in a test arena.

We decided not to investigate other global planners further, because the BaseGlobalPlanner already gave good results. The possible gain is not worth the time investment.

Figure 6 Example of a global path


R3.3 How can we specify a goal to the robot?

The decision was made to use the move_base (move_base, 2017) package for navigation. The move_base wiki page mentions two ways of interfacing: using simple navigation goals with no feedback, and using an action client (SimpleActionClient, 2017) with feedback. We use both variants in different situations.

When using our map_marker GUI we use the simple navigation goals, because we need no feedback on the navigation progress. The operator manually gives goals, checks them and marks them. The simplest implementation was adequate.

When running in competition mode (connected to the referee box) we use the action client interface. Feedback on whether the robot reaches its set goal (or not) is needed, because other steps in the given task need to be executed. Because we needed the feedback, we implemented the more complex interface.
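
The sketch below shows both interfaces side by side: a fire-and-forget goal published on move_base_simple/goal, and an actionlib SimpleActionClient that waits for the result. The pose values and the map frame name are placeholders; in our implementation the goals come from the map_marker GUI or from the referee box.

    #!/usr/bin/env python
    # Minimal sketch of the two move_base interfaces; pose values are placeholders.
    import rospy
    import actionlib
    from geometry_msgs.msg import PoseStamped
    from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

    def send_simple_goal(x, y):
        # Fire-and-forget: publish a PoseStamped, no feedback on the result.
        pub = rospy.Publisher('/move_base_simple/goal', PoseStamped,
                              queue_size=1, latch=True)
        goal = PoseStamped()
        goal.header.frame_id = 'map'
        goal.header.stamp = rospy.Time.now()
        goal.pose.position.x = x
        goal.pose.position.y = y
        goal.pose.orientation.w = 1.0
        pub.publish(goal)

    def send_action_goal(x, y):
        # Action client: blocks until move_base reports success or failure.
        client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
        client.wait_for_server()
        goal = MoveBaseGoal()
        goal.target_pose.header.frame_id = 'map'
        goal.target_pose.header.stamp = rospy.Time.now()
        goal.target_pose.pose.position.x = x
        goal.target_pose.pose.position.y = y
        goal.target_pose.pose.orientation.w = 1.0
        client.send_goal(goal)
        client.wait_for_result()
        return client.get_state()

    if __name__ == '__main__':
        rospy.init_node('goal_example')
        send_action_goal(1.0, 0.5)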


R3.4 What is the accuracy of our implementation?

We tested the accuracy of our implementation by letting the robot navigate to several waypoints on the map.

Figure 7 Image of the testing area.

We first mapped the testing area before defining a few points. We used three points on the map:

• Home/Starting
• Point 1: In another “room”
• Point 2: Behind an obstacle

For each position we defined, we marked the position of each wheel with a piece of tape, and we saved the position in our GUI tool. We then let the robot navigate from point to point, noting down the absolute drift and the absolute rotation error, which we measured with a tape measure and a set square. We let the robot start at point 1, go to home, go to point 2, and then we repeated that cycle 5 times.


The results are as follows:

Figure 8 Graph: Errors at point 1

Figure 9 Graph: Errors at point Home.


Figure 10 Graph: Errors at point 2

We observed that the drift is not cumulative, and whatever drift there was could be considered noise, since it varied. We did note that the drift was consistently bigger for the home position, which might be because of how close the rear of the robot was to the wall, causing the rear LiDAR not to be as effective. It could also be that the path planning algorithm would not let the robot come any closer to the wall.

Figure 11 Graph: All points, chronological

When the measurements are put in chronological order, spikes are visible for the home position, but otherwise the drift looks reasonably consistent and small. We tried to run the same procedure for the original code with one LiDAR, but this resulted in such a big drift that the robot failed to reach the waypoints. This caused the recovery behavior to stay active until it was cancelled manually.


Conclusion

A big part of the project was setting up a working movement stack. This document explained all the research done on this subject. We split the research up into three main research questions.

R1 How does the movement stack provided in the Youbot sources work and how does it perform?

An important start was figuring out how the currently provided sources worked (if they worked at all). We also researched which components (sensors, actuators) the Youbot has. In the end, answering this question first gave a good starting point. While the provided sources did not work at first, after a couple of weeks of researching and tweaking we managed to get basic functionality working. We were able to drive to a point in the map, but after that the robot had lost its position and could not go to another point. Based on this research, the decision was made to build up the movement stack ourselves instead of using the provided sources.

R2 How can we create a map of the arena?

Successfully navigating the map means having a map in the first place. Answering this question started with looking at the sources from RoboCup@Work for the Youbot. We wanted to determine if these sources were capable of creating (and saving) a map for later use. After testing, the answer was yes. We also looked at alternatives for map creation. In the end we determined that the already available implementation was adequate.

R3 How can we navigate a map to reach certain points?

With the ability to create a map in advance we still needed to navigate it. This was the biggest (and most difficult) part of the research. Again we started by looking at the available sources from RoboCup@Work. Though we could get them working after a while, they were very unreliable (the robot did not reach its goal and started rotating). We decided to build our own navigation stack using as many standard ROS components as possible.

We started by investigating the possibilities for a navigation package in ROS. When using ROS the only well-supported option is move_base, a package which creates a path from a goal message, avoids walls and dynamic obstacles, and recovers when it cannot reach its goal.

Because we were using move_base as our navigation package we needed to select a global and a local planner. We decided to do this by testing different variants on the Youbot. For the global planner we quickly decided to use the default option; its performance was good and reliable. For the local planner we tested a total of four options. After testing, the DWA planner came out on top. It was the only package which could take advantage of the holonomic nature of the robot (omni wheels); for example, it allows the robot to move sideways and backwards.


Recommendations

We mainly want to make recommendations for improvements to the navigation stack (efficiency, speed, etc.). There are two main points where we think improvements can be made: the global and the local planner.

The global planner we used was the default option. It works well and is reliable. However, it does not always give the shortest path to the goal. Further research into alternatives could give a time gain.

For the local planner we selected the available DWA planner from ROS. It allows the robot to fully use the capabilities of its omni wheels. However, it does make the robot move slowly around corners, because the LiDAR sensors cannot see the edges around the corners. It is also quite a resource-intensive planner, which slows the entire navigation stack down. A good solution could be to write a custom local planner to get better results.


References

bhaskara. (2017, 12 12). slam_karto. Retrieved from GitHub: https://github.com/ros-perception/slam_karto

Christian Connette, B. M. (2017, 10 02). eband_local_planner. Retrieved from wiki.ros.org: http://wiki.ros.org/eband_local_planner

Eitan Marder-Eppstein, D. V. (2015, 06 29). costmap_2d. Retrieved from wiki.ros.org: http://wiki.ros.org/costmap_2d

Eitan Marder-Eppstein, E. P. (2017, 02 25). base_local_planner. Retrieved from wiki.ros.org: http://wiki.ros.org/base_local_planner

Eitan Marder-Eppstein, S. C. (2018, 01 10). carrot_planner. Retrieved from wiki.ros.org: http://wiki.ros.org/carrot_planner

Gerkey, B. (2015, 12 09). gmapping. Retrieved from wiki.ros.org: http://wiki.ros.org/gmapping

Gerkey, B. P. (2017, 07 07). amcl. Retrieved from wiki.ros.org: http://wiki.ros.org/amcl

Hokuyo UBG-04LX-F01. (n.d.). Retrieved from hokuyo-aut.jp: https://www.hokuyo-aut.jp/search/single.php?serial=164

Hugh Durrant-Whyte, F. I. (n.d.). Simultaneous Localisation and Mapping (SLAM). Retrieved from http://www-personal.acfr.usyd.edu.au/tbailey/papers/slamtute1.pdf

IsaacSaito. (2016, 12 10). SimpleActionClient. Retrieved from wiki.ros.org: http://wiki.ros.org/actionlib_tutorials/Tutorials/SimpleActionClient

Labbe, M. (2017, 09 08). rtabmap_ros. Retrieved from wiki.ros.org: http://wiki.ros.org/rtabmap_ros

Lu!!, D. (2017, 04 18). global_planner. Retrieved from wiki.ros.org: http://wiki.ros.org/global_planner

Marder-Eppstein, E. (2016, 12 26). move_base. Retrieved from wiki.ros.org: http://wiki.ros.org/move_base

Marder-Eppstein, E. (2017, 03 10). navigation. Retrieved from wiki.ros.org: http://wiki.ros.org/navigation

Marder-Eppstein, E. (2018, 01 11). dwa_local_planner. Retrieved from wiki.ros.org: http://wiki.ros.org/dwa_local_planner

Nico Hochgeschwender, R. K. (2017, January 23). RoboCup@Work Rulebook. Retrieved from robocupatwork.org: http://www.robocupatwork.org/download/rulebook-2017-01-24.pdf

Rösmann, C. (2016, 12 27). teb_local_planner. Retrieved from wiki.ros.org: http://wiki.ros.org/teb_local_planner

SawYer. (2017, 03 07). RobotsUsingNavStack. Retrieved from wiki.ros.org: http://wiki.ros.org/navigation/RobotsUsingNavStack

SwarmLab@Work. (2017, November). swarmlabatwork/slaw_navigation. Retrieved from GitHub: https://github.com/swarmlab/swarmlabatwork/tree/master/slaw_navigation

Zheng, K. (2016, 09 02). ROS Navigation Tuning Guide. Retrieved from kaiyuzheng.me: http://kaiyuzheng.me/documents/navguide.pdf