Sachin Chitta, Ioan Sucan, Acorn Pooley with contributions from: Dave Coleman, Suat Gedikli, Mario Prats, Matei Ciocarlie, Kaijen Hsiao, Jon Binney, Adam Leeper, Julius Kammerl, David Gossow, Vincent Rabaud, Dave Hershberger and the ROS and PR2 communities


Page 1

Sachin Chitta, Ioan Sucan, Acorn Pooley

with contributions from: Dave Coleman, Suat Gedikli, Mario Prats, Matei Ciocarlie, Kaijen Hsiao, Jon Binney, Adam Leeper, Julius Kammerl, David Gossow, Vincent Rabaud, Dave Hershberger and the ROS and PR2 communities

Page 2

What is MoveIt!

• MoveIt! - Software for building mobile manipulation applications

❖ Motion Planning, Kinematics, Collision Checking integrated with Perception, Grasping, Control and Navigation for Mobile Manipulation

Page 3

Motivation

• Build a state-of-the-art software platform for robotics applications and research

• “Simple things should be easy”

❖ Provide an out-of-the-box experience

• easy to set up with new robots - Setup Assistant

❖ Easy to use APIs - C++ and Python

•“Allow users to dive deeper to address harder problems”

❖ Flexible platform - easy to add new components

• Performance

❖ designed for high performance

Page 4

Motivation

• Developing new higher-level capabilities is time-consuming

❖ building every capability from scratch is a waste of effort

• Environment awareness is critical for new applications

❖ Integrated 3D perception can provide situational awareness, improving safety, especially in human-robot collaborative tasks

• Motion planning is important for dynamic, changing environments

❖ essential for maintaining safety and performance in human-robot collaborative tasks

• Constrained manipulation tasks are hard to solve

❖ increasingly complex types of constraints need to be handled

Page 5

MoveIt!: Evolution

•MoveIt! has evolved from the arm_navigation and grasping pipeline set of stacks

❖ essentially a rewrite from scratch

❖ ROS API almost the same

❖ incorporates lessons learnt

Page 6

Mobile Manipulation: State of the art


Chitta, Jones, Ciocarlie, Hsiao, Sucan, 2011

Page 7

Robots Using Our Software

Page 8

Application - ROS-Industrial


*collaboration between Willow Garage, SWRI and Yaskawa - http://www.rosindustrial.org

Page 9

MoveIt!

Page 10

Application - ROS-Industrial

10

Rob@Work running MoveIt!

(video at 8X speed)

Page 11

MoveIt!

• Technical Capabilities

❖ Collision Checking: fast and flexible

❖ Integrated Kinematics

❖ Motion Planning

• fast, good quality paths

• kinematic constraints

❖ Integrated Perception for Environment Representation

❖ Standardized Interfaces to Controllers

❖ Execution and Monitoring

❖ Kinematic Analysis

Page 12

MoveIt!

Page 13

Collision Checking

• FCL - Flexible Collision Library*

❖ parallelizable collision checking

❖ up to about 2,000-3,000 full-body collision checks per second for the PR2

✓ with realtime sensor data

❖ + high fidelity mesh model

• Proximity Collision Detection

❖ uses a 3D distance transform to determine the distance to the nearest obstacle and its gradient

❖ + very fast - 40,000 to 80,000 collision checks per second for the full body of the PR2

❖ - not as accurate


*Pan, Sucan, Chitta, Manocha - 2012
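The proximity approach can be sketched in a few lines: precompute a distance transform of the occupancy grid once, then each collision query is just a handful of table lookups. The code below is an illustrative 2D sketch with made-up names, not the actual MoveIt! implementation (which uses a 3D distance field):

```cpp
#include <queue>
#include <utility>
#include <vector>

// Occupancy grid with a precomputed distance-to-nearest-obstacle field.
struct DistanceField {
    int w, h;
    std::vector<int> dist;  // distance (in cells) to the nearest obstacle
};

// Brushfire/BFS distance transform: seed the queue with obstacle cells,
// then expand outward. O(cells), done once per environment update.
DistanceField distanceTransform(int w, int h,
                                const std::vector<std::pair<int, int>>& obstacles) {
    DistanceField f{w, h, std::vector<int>(w * h, -1)};
    std::queue<int> q;
    for (const auto& o : obstacles) {
        f.dist[o.second * w + o.first] = 0;
        q.push(o.second * w + o.first);
    }
    const int dx[] = {1, -1, 0, 0}, dy[] = {0, 0, 1, -1};
    while (!q.empty()) {
        int i = q.front(); q.pop();
        int x = i % w, y = i / w;
        for (int k = 0; k < 4; ++k) {
            int nx = x + dx[k], ny = y + dy[k];
            if (nx < 0 || ny < 0 || nx >= w || ny >= h) continue;
            int j = ny * w + nx;
            if (f.dist[j] == -1) { f.dist[j] = f.dist[i] + 1; q.push(j); }
        }
    }
    return f;
}

// The robot body is approximated by sample points with a clearance radius;
// it is "in collision" if any point is closer to an obstacle than that
// radius. This is why the method is fast but less accurate.
bool inCollision(const DistanceField& f,
                 const std::vector<std::pair<int, int>>& body_points,
                 int clearance) {
    for (const auto& p : body_points)
        if (f.dist[p.second * f.w + p.first] < clearance) return true;
    return false;
}
```

Each query costs one lookup per body point, which is how tens of thousands of full-body checks per second become possible; the price is the conservative point-and-radius approximation of the robot geometry.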

Page 14

Motion Planning

•Plugin interface for planners

•Integration with robots through MoveIt!

•Automatically configured using the MoveIt! Setup Assistant

❖ Sampling based planners (OMPL)

❖ Search Based Planning Library (SBPL)


*http://ompl.kavrakilab.org

*http://www.ros.org/wiki/sbpl

*OMPL - Sucan, Moll, Kavraki (Rice University), SBPL - Max Likhachev (CMU)
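The plugin idea can be illustrated with a minimal sketch: planners implement a common interface and are constructed by name from a registry, so new planners can be added without modifying the core. All names below are hypothetical (MoveIt! actually builds on ROS pluginlib), but the pattern is the same:

```cpp
#include <functional>
#include <map>
#include <memory>
#include <string>
#include <vector>

struct PlanRequest {
    std::vector<double> start, goal;  // joint-space configurations
};
struct PlanResponse {
    bool success = false;
    std::vector<std::vector<double>> path;  // waypoints in joint space
};

// Common interface every planner plugin implements.
class PlannerPlugin {
public:
    virtual ~PlannerPlugin() = default;
    virtual PlanResponse solve(const PlanRequest& req) = 0;
};

// Registry mapping plugin names to factory functions.
using PlannerFactory = std::function<std::unique_ptr<PlannerPlugin>()>;
std::map<std::string, PlannerFactory>& plannerRegistry() {
    static std::map<std::string, PlannerFactory> registry;
    return registry;
}

// A trivial "planner" that connects start and goal directly, standing in
// for an OMPL- or SBPL-backed plugin.
class StraightLinePlanner : public PlannerPlugin {
public:
    PlanResponse solve(const PlanRequest& req) override {
        return {true, {req.start, req.goal}};
    }
};
```

A robot configuration file can then select a planner purely by its registered name, which is what keeps the framework robot- and planner-agnostic.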

Page 15

Easy Setup and Configuration

Page 16

Easy Setup and Configuration

Page 17

Easy Setup and Configuration

Page 18

Easy Setup and Configuration

Page 19

Easy Setup and Configuration

Live Demo by Dave Coleman later today!

Page 20

Integrating Perception


Octomap: octomap.sf.net
Point Cloud Library: pointclouds.org

Page 21

Reactive Motion


* Live Demo later!!

Page 22

Constraint Representation


❖ Joint Constraints

❖ Position Constraints

❖ Orientation Constraints

❖ Visibility Constraints
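As an illustration, a joint constraint can be thought of as an allowed interval around a nominal joint position. The sketch below is loosely modeled on that idea; the names are hypothetical, not the actual moveit_msgs definitions:

```cpp
// Hypothetical joint constraint: a nominal position with asymmetric
// tolerances (loosely modeled on the idea behind a joint constraint
// message; not the real API).
struct JointConstraint {
    double position;         // nominal joint value (radians)
    double tolerance_above;  // allowed deviation above the nominal value
    double tolerance_below;  // allowed deviation below the nominal value
};

// A joint value satisfies the constraint if it lies inside the interval
// [position - tolerance_below, position + tolerance_above].
bool satisfiesJointConstraint(const JointConstraint& c, double joint_value) {
    return joint_value <= c.position + c.tolerance_above &&
           joint_value >= c.position - c.tolerance_below;
}
```

Position, orientation and visibility constraints follow the same pattern: a nominal target plus a tolerance region, checked against the current robot state.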

Page 23

Orientation Constraints

Page 24

Applications: Benchmarking


could represent data obtained directly from the sensors on a robot.

2) Robot - The particular robot used for the benchmark experiments is specified using the ROS robot description formats.

3) Goals - Associated with the environment is a set of goals for motion planning, e.g., goals may be specified in areas of the environment where objects may be typically found. A robot-agnostic goal specifies the desired position and orientation for a particular part of the robot, e.g., the end-effector or tool of the robot. These goals may need to be adjusted to account for the geometry of individual robots. Robot-specific goals position individual robots in desired poses and need to be specifically implemented for each robot. Goals also include constraints on the different links or joints of the robot. The current implementation of our benchmarking approach allows for the specification of position and orientation constraints on the links of a robot, joint constraints restricting the range of motion of a joint, and visibility constraints that allow sensors to maintain visibility of links on the robot or of regions in the environment.

4) Robot Start State - The robot start state is specific to each robot and provides complete information on the state of the robot at the start of a motion plan.

The benchmark problems are designed to exercise the planners through a wide range of scenarios. Some of the environments were set up with the express intention of testing the performance of planners in difficult situations, e.g., the narrow passageway problem. Note that we do not intend for this set of benchmarks to be the final set of exhaustive benchmarks that all planners will be run against. Instead, we hope, with the help of the community, to build upon this set and include environments and problems that exercise other capabilities of planners, e.g., dealing with dynamic obstacles or uncertainty. We will now describe in detail the full set of example benchmark experiments that we have currently designed and configured.

A. Testing Environments

Fig. 4, Fig. 5, Fig. 6 and Fig. 7 illustrate the sample environments that we use in all our experiments. The robot used in all experiments is the simulated PR2 robot. The PR2 is a state-of-the-art mobile manipulation robot with an omni-directional base, two arms and parallel-jaw grippers. The robot also has multiple sensors including an RGB-D sensor, two laser scanners, an IMU, tactile sensors and stereo cameras.

The environments represent just a sample of the types of scenarios that mobile manipulation robots can expect to find in typical human environments. The first environment (Fig. 4) represents a scenario where the robot needs to pick up objects from a shelf containing multiple parts. The second environment (Fig. 5) represents a typical kitchen scenario where the robot may be expected to pick up objects from multiple places on the countertop or in the shelves. The third

Fig. 4. The INDUSTRIAL benchmark environment: The pink spheres indicate desired goal locations for the right end-effector of the PR2 robot. The arrows indicate the desired orientation of the end-effector.

Fig. 5. The KITCHEN benchmark environment. This environment is human-scale.

environment (Fig. 6) represents a scenario where the robot is expected to move its arm through a narrow passageway. The position of the wall restricts the workspace available for the arm considerably. The fourth environment (Fig. 7) again represents a typical kitchen environment where the robot might be expected to move an object around on a table.

The motion planning problems we chose to implement for this paper are derived from typical pick and place actions in human environments. Our problems specify desired goal configurations for the PR2's end-effector in typical positions from where objects may be picked up or where objects may be placed down. Fig. 4, Fig. 5 and Fig. 7 show some of the desired goal configurations for the right end-effector of the PR2 robot (indicated by the red numbered markers). The goals in Fig. 6 are designed to force the right arm of the robot to move through the narrow constrained space between the robot and the wall. Additional constraints may be separately specified in the motion planning problem, e.g., a dual-arm motion of the robot may require the two arms to


Fig. 6. The NARROW PASSAGE benchmark environment.

Fig. 7. A set of goals for manipulation tasks on a table in the TABLETOP environment.

move in a constrained manner. Note that the desired goals do not constrain the redundant degrees of freedom of the PR2 robot arm in any manner. Each planner is thus free to choose the joint configuration that satisfies the desired goal configuration, e.g., by using inverse kinematics.

The goals are not specified as a single pose for the end-effector. They are instead specified as goal regions, i.e., small regions in SE(3) to which the robot's end-effector should be able to move. Goal regions provide the planners with a small amount of tolerance in determining when they are at the goal. Planners can choose to sample individual goal poses inside the goal regions or use the goal regions directly if possible. The size and shape of the goal regions are parameters that can be set based on the overall task being executed by the robot, e.g., grasping certain objects might require very tight tolerances on the goal region for the planners.

Fig. 8. The collision checking models used by the two planning libraries - SBPL (left) and OMPL (right)
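The goal-region idea amounts to a simple membership test: a pose is a valid goal if its position lies within a tolerance box around the nominal position and its orientation is within an angular tolerance of the nominal orientation. The following is an illustrative sketch under those assumptions, not the benchmarking code itself:

```cpp
#include <cmath>

struct Quaternion { double w, x, y, z; };  // assumed unit-length

// Angular distance between two unit quaternions: 2*acos(|<a,b>|).
double quaternionAngle(const Quaternion& a, const Quaternion& b) {
    double d = std::fabs(a.w * b.w + a.x * b.x + a.y * b.y + a.z * b.z);
    if (d > 1.0) d = 1.0;  // guard against rounding slightly past 1
    return 2.0 * std::acos(d);
}

// A goal region in SE(3): a box around a nominal position plus a cone of
// allowed orientations (names are illustrative).
struct GoalRegion {
    double px, py, pz;       // nominal end-effector position
    double pos_tolerance;    // half-width of the box, per axis
    Quaternion orientation;  // nominal end-effector orientation
    double angle_tolerance;  // allowed angular deviation (radians)
};

bool inGoalRegion(const GoalRegion& g, double x, double y, double z,
                  const Quaternion& q) {
    return std::fabs(x - g.px) <= g.pos_tolerance &&
           std::fabs(y - g.py) <= g.pos_tolerance &&
           std::fabs(z - g.pz) <= g.pos_tolerance &&
           quaternionAngle(g.orientation, q) <= g.angle_tolerance;
}
```

Tightening pos_tolerance and angle_tolerance recovers a single-pose goal; loosening them gives the planner the slack described above.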

B. Planners

To validate our benchmarking approach, we tested it with two types of motion planners: search-based planners from the SBPL library [23] and randomized planners from the OMPL library [20].

The underlying search-based planner used here is the ARA* planner [24]. A set of discretized motion primitives in joint or Cartesian space are used with the planner to generate motion plans for the arms of a robot [25], [26]. An inflated heuristic derived from a 3D Dijkstra search in the environment is used to accelerate this planning process. The ARA* planner provides bounds on sub-optimality with respect to the graph that represents the planning problem. We configured the search-based planners to return the first solution they find and set the sub-optimality bound parameter ε to 100. The ARA* planner is an anytime planner capable of improving the solutions that it finds over time, but we chose not to use these capabilities of the planner. The cost function for the search-based planner incorporated the distance traveled in joint space.
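The effect of the inflated heuristic can be seen in a toy weighted-A* sketch. The real planner searches a graph of arm motion primitives with a 3D Dijkstra-based heuristic; the 2D grid below only illustrates the mechanism, where the first solution found costs at most eps times the optimum with respect to the search graph:

```cpp
#include <cstdlib>
#include <queue>
#include <vector>

struct OpenNode {
    double f;  // g + eps * h
    int idx;
    bool operator>(const OpenNode& o) const { return f > o.f; }
};

// Weighted A* on a 4-connected grid with unit step costs. eps = 1 gives
// plain A*; eps > 1 inflates the heuristic, trading optimality for speed
// with a bounded sub-optimality factor of eps.
double weightedAStar(int w, int h, int sx, int sy, int gx, int gy,
                     const std::vector<char>& blocked, double eps) {
    std::vector<double> g(w * h, 1e18);
    auto heuristic = [&](int x, int y) {
        return std::abs(x - gx) + std::abs(y - gy);  // Manhattan distance
    };
    std::priority_queue<OpenNode, std::vector<OpenNode>,
                        std::greater<OpenNode>> open;
    g[sy * w + sx] = 0.0;
    open.push({eps * heuristic(sx, sy), sy * w + sx});
    const int dx[] = {1, -1, 0, 0}, dy[] = {0, 0, 1, -1};
    while (!open.empty()) {
        OpenNode n = open.top();
        open.pop();
        int x = n.idx % w, y = n.idx / w;
        if (x == gx && y == gy) return g[n.idx];  // first solution found
        for (int k = 0; k < 4; ++k) {
            int nx = x + dx[k], ny = y + dy[k];
            if (nx < 0 || ny < 0 || nx >= w || ny >= h) continue;
            int j = ny * w + nx;
            if (blocked[j]) continue;
            if (g[n.idx] + 1.0 < g[j]) {
                g[j] = g[n.idx] + 1.0;
                open.push({g[j] + eps * heuristic(nx, ny), j});
            }
        }
    }
    return -1.0;  // no path
}
```

ARA* additionally reuses search effort while decreasing eps over time to improve the first solution, an anytime capability the benchmark deliberately did not exercise.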

The randomized planning approaches tested here are KPIECE [27], LBKPIECE [27], SBL [28], RRT* [29], RRT and RRT-Connect [30]. The cost function for RRT* is again the distance traveled in joint space. RRT* was configured to return the first solution that was found. All these planners are implemented in the OMPL library and use the new FCL collision checking library [31].

It should be noted here that the performance of planners can depend on a large number of factors. In particular, collision checking is a critical component of motion planning and also often the most expensive. The two motion planning libraries we benchmark use different collision representations for the robot (Fig. 8) and also employ different approaches to the collision representation for the environment. The collision representation used by the search-based planners for representing the robot is more conservative but an order of magnitude faster than the mesh-based representation used for the randomized planners. However, the representation of the environment for the search-based planners uses a distance


• Industrial Environment

• Narrow Passage Environment

• Kitchen Environment

• Tabletop Environment

* Cohen, Sucan, Chitta, “A Generic Infrastructure for Benchmarking Motion Planners”, IROS 2012, Portugal

Page 25

Benchmarking


More in Talk by Ryan Luna and Ioan Sucan (later today)

Page 26

Applications: Kinematic Workspace Analysis


•Robot Design Evaluation

•Robot Workspace Placement

Page 27

Kinematic Workspace Analysis

Page 28

MoveIt!

• Applications - Pick and Place

❖ Integrated Grasping, Planning, Perception and Execution


Page 29

User API

•Really simple API (e.g. moving an arm):


move_group_interface::MoveGroup group("arm");
group.setRandomTarget();
group.move();

Page 30

MoveIt!

• New Architecture (different from arm navigation)

✓ Minimize transport and messaging overhead - a single process for planning and perception shares the environment representation (planning scene), vs. multiple ROS nodes each performing individual functions

❖ Computation - core capabilities (e.g. motion planning, kinematics, etc.) are set up in C++ libraries

❖ Communication and Configuration through ROS

❖ Emphasis on speed and efficiency – parallelize collision checking, kinematics, etc.
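The "parallelize collision checking" point can be sketched as a batch of independent single-state checks distributed across threads. The function below is an illustrative pattern with made-up names, not MoveIt's actual API:

```cpp
#include <functional>
#include <thread>
#include <vector>

// Check a batch of robot states for collision in parallel. `check` is any
// single-state collision test (an FCL query in the real system); states are
// striped across threads, and each result slot is written by exactly one
// thread, so no locking is needed.
std::vector<char> checkStatesParallel(const std::vector<double>& states,
                                      const std::function<bool(double)>& check,
                                      unsigned num_threads) {
    std::vector<char> in_collision(states.size(), 0);
    std::vector<std::thread> pool;
    for (unsigned t = 0; t < num_threads; ++t) {
        pool.emplace_back([&, t] {
            for (std::size_t i = t; i < states.size(); i += num_threads)
                in_collision[i] = check(states[i]) ? 1 : 0;
        });
    }
    for (auto& th : pool) th.join();
    return in_collision;
}
```

Because collision checks dominate planning time, this kind of embarrassingly parallel batching is one of the main levers behind the "design for high performance" goal.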

Page 31

MoveIt!

• Capabilities (differences from arm navigation)

❖ Collision Checking

✓ Parallelizable

✓ can switch between different types of collision checkers

✓ cleaner C++ interface

❖ Motion Planning

✓ plugin-based C++ interface (in addition to the ROS interface)

✓ Parallelizable

✓ planning pipeline includes trajectory smoothing

Page 32

Highlights


•Technical

- Performance

❖ Single process sharing the environment representation

❖ Parallelizable collision checking and kinematics

❖ Parallelizable pick and place (upcoming capability)

- Integrated Perception for Environment Representation

❖ Can incorporate any source of point clouds

❖ Fast self-filtering and environment representation

- Reactive Motion Planning

❖ Safer operation in collaborative environments

Page 33

Highlights


• User Friendly

❖ Easy configuration for new robots

❖ Graphical User Interfaces

❖ Better Visualization and Introspection

❖ Easy to use C++ API

❖ Python bindings

Page 34

Highlights


• Integrated Applications

❖ Collision-free Motion Planning and Execution

❖ Kinematic Analysis/ Reachability Analysis

❖ Benchmarking

• More applications in development ...

❖ Pick and Place - more about this in the afternoon session

Page 35

Documentation - Wiki

Page 36

Github Repository

Page 37

Issue Tracking

Page 38

Community


[email protected] - questions related to how you can use MoveIt!

Page 39

Where are we going?


- MoveIt!

✓ What does it take to use MoveIt! in products?

✓ Is there interest in enterprise-level, supported versions of MoveIt! and associated capabilities in ROS?