Overview of Amazon Picking Challenge 2015

The first Amazon Picking Challenge (APC) was held at the 2015 International Conference on Robotics and Automation in Seattle, Washington, May 26-27. The objective of the competition was to provide a challenge problem to the robotics research community that involved integrating the state of the art in object perception, motion planning, grasp planning, and task planning in order to manipulate real-world items in industrial settings. To that end, we posed a simplified version of the task that many humans face in warehouses all over the world: pick items out of shelves and put them into containers. In this case, the shelves were prototypical pods from Kiva Systems and the picker had to be a fully autonomous robot.

The items were a pre-selected set of 24 products that were commonly sold on Amazon.com and which we expected would pose varying degrees of difficulty for the contestants. On the easier end were simple cuboids like a box of straws or a spark plug. Some items were chosen because they were easy to damage, like the two soft-cover books, or the package of crushable Oreo cookies. Others were harder to perceive and grasp, like the unpackaged dog toys or the black mesh pencil holder. The box of Cheez-Its posed a challenge because it couldn’t be removed from the bin without twisting it sideways.

Each pod had 12 bins, and the 24 products were distributed among the bins in such a way that each competitor had the same challenges. Each bin had one target item to be picked with a base score of 10, 15, or 20 points depending on how many other items were in the bin. In addition, some items that were projected to be more difficult to pick were given 1 to 3 bonus points. Damaging an item incurred a 5 point penalty, while picking the wrong item incurred a 12 point penalty. Each competitor had 20 minutes to pick as many of the 12 target items as possible and could score as many as 190 points.

The competition was announced October 1, 2014. Through a series of video submissions, the organizers selected 25 teams to receive equipment grants (sample pods and products) and travel grants to help defray the costs of travel to the venue. In addition, Amazon provided $26,000 in prize money for the winning teams.

Twenty-six teams from 11 countries made the trip to Seattle to try their robot's hand at picking out of Kiva pods. The success of the teams was mixed, but the enthusiasm and excitement were contagious. The competition was won by RBO from the Technical University of Berlin. Their system, built from a Barrett arm, a Nomadic Technologies mobile platform, and a suction cup attached to a commercial vacuum cleaner, successfully picked 10 of the 12 target items in under 20 minutes. Their score of 148 points put them well into the lead. MIT placed second with seven items picked and 88 points. Their entry used an industrial ABB arm and a scooper end-effector that could be flipped over to alternatively use a small suction cup. Third place went to Team Grizzly from Dataspeed Inc. & Oakland University with 35 points. Their solution used a Baxter robot attached to their own custom mobile base.

The final scores are shown in Table 1. Many teams demonstrated successful picking in their warm-ups but for various reasons failed in their official 20-minute attempt. The reasons for failure varied widely, and included last-minute code changes, failure to model how a vacuum hose would twist around the arm in certain poses, and grippers so large that the team could not find a way to get them into the bins. However, even the systems that failed to pick any items demonstrated interesting robots, end effectors, and technical approaches. Overall, 36 correct items were picked and 7 incorrect items were picked.

Table 1. Final scores.

Team               Affiliation                                                  Items Picked   Score
RBO                T.U. Berlin                                                  10             148
Team MIT           M.I.T.                                                       7              88
Grizzly            Dataspeed & Oakland U.                                       3              35
NUS Smart Hand     National University of Singapore                             2              32
Z.U.N.             Zhejiang U., Univ. of Tech. Sydney & Nanjiang Robotics Co.   1              23
C^2M               Chubu U., Chukyo U., & Mitsubishi Electric Corporation       2              21
R U Pracsys        Rutgers                                                      1              17
Team K             JSK, University of Tokyo                                     4              15
Nanyang            Nanyang Technological University                             1              11
A.R.               Netherlands                                                  1              11
Team Georgia Tech  Georgia Tech                                                 1              10
Team Duke          Duke University                                              1              10
CVAP               KTH (Sweden)                                                 2              9

Other teams competing included Worcester Polytechnic, University of Texas at Austin, University of Texas at Arlington, University of Washington, University of Alberta, Robological PTY LTD, Universitat Jaume I, University of Colorado at Boulder, Colorado School of Mines, University of Pisa, University of California at Berkeley, Dorabot and the University of Hong Kong, and St. Francis Institute of Technology in India. The teams were supported by several hardware vendors, including Rethink Robotics, Barrett Technologies, Yaskawa, Olympus Controls, and Clearpath Robotics.

The first APC was very successful, drawing a large number of competitors from around the world and demonstrating the state of the art in both the software and the hardware required for robotic manipulation. Despite being scattered over 16 testing bays in the ICRA competition area and spread over two days, every team drew a large crowd of spectators eager to see how the robots would perform.

Contest Details

Prior to ICRA 2015 each team will be responsible for coordinating transportation of their system to Seattle, WA. The contest organizers and ICRA challenge committee will assist with shipping details. A setup bullpen will be arranged for teams to prepare their systems, practice, and demonstrate their work to the wider ICRA audience. Prior to each official attempt, the team will be given time to set up their system in front of the test shelf. At the start of the attempt the team will be given their work order and shelf contents via the pre-defined Contest Interface. The work order and shelf content data will contain all the items that must be picked, along with a list of the other contents of the shelf bins. The robot will then be responsible for pulling as many of the items as possible out of the shelf bins and setting them down inside the robot area within the allotted time. Points will be awarded for successfully picked items, and deducted for dropped, damaged, or incorrectly picked items.


Shelving System

The shelving system will be a steel and cardboard structure seen in the CAD model below. An STL and Gazebo model of the shelf is available for download on the picking challenge website, and a physical copy was sent to teams awarded practice equipment. Only a single face of the shelf will be presented to the teams, but the stocking of the items will be changed pseudo-randomly between attempts. Only a subset of 12 bins on the shelf face will be used. The subset is a patch of the shelf face covering bins inside a roughly 1 meter x 1 meter area. The base of the first shelf from the floor is at a height of approximately 0.78 meters. Each bin will contain one or more items. A single item in each bin will be designated as a target item to be picked. Each bin is labeled (A-Z) and referred to in the Contest Interface below. Teams will not be allowed to modify the shelf in any way prior to the competition (or to damage it during their attempt).

Items

The set of potential items that will be stocked inside the bins is shown in the graphic below. A list of all the items can be found below. Note that the images on the Amazon shopping website are not necessarily representative of the form the item will be in at the contest. The graphic below shows the proper form (including things like packaging). The contest shelf may contain all the announced items, or a partial subset of them. Teams awarded practice equipment will be provided a set of the contest items. All items the teams are required to pick from the system will be a subset of this set. All items will be sized and located such that they could be picked by a person of average height (170 cm) with one hand.

http://www.amazon.com/gp/product/B0010XUO52/ref=ox_sc_act_title_1?ie=UTF8&psc=1&smid=ATVPDKIKX0DER
http://www.amazon.com/gp/product/B00006JN7R/ref=ox_sc_act_title_2?ie=UTF8&psc=1&smid=ATVPDKIKX0DER
http://www.amazon.com/gp/product/B00081J3N6/ref=ox_sc_act_title_3?ie=UTF8&psc=1&smid=ATVPDKIKX0DER
http://www.amazon.com/gp/product/B00004YO15/ref=ox_sc_act_title_4?ie=UTF8&psc=1&smid=A1T2QV8RUJMJWO
http://www.amazon.com/gp/product/B00BKL52YC/ref=ox_sc_act_title_5?ie=UTF8&psc=1&smid=ATVPDKIKX0DER


http://www.amazon.com/gp/product/B002GJXSQ6/ref=ox_sc_act_title_6?ie=UTF8&psc=1&smid=A3OUBK9YC1TQAT
http://www.amazon.com/gp/product/B00009OYGV/ref=ox_sc_act_title_7?ie=UTF8&psc=1&smid=ATVPDKIKX0DER
http://www.amazon.com/gp/product/B00006IEES/ref=ox_sc_act_title_8?ie=UTF8&psc=1&smid=ATVPDKIKX0DER
http://www.amazon.com/gp/product/B00006IEJC/ref=ox_sc_act_title_9?ie=UTF8&psc=1&smid=ATVPDKIKX0DER
http://www.amazon.com/gp/product/B00176CGK8/ref=ox_sc_act_title_10?ie=UTF8&psc=1&smid=ATVPDKIKX0DER
http://www.amazon.com/gp/product/B000GUZC2A/ref=ox_sc_act_title_1?ie=UTF8&psc=1&smid=ATVPDKIKX0DER
http://www.amazon.com/gp/product/B008JBDMO8/ref=ox_sc_act_title_2?ie=UTF8&psc=1&smid=A1G5V6RMJ146ZF
http://www.amazon.com/gp/product/B000CIQ47S/ref=ox_sc_act_title_3?ie=UTF8&psc=1&smid=ATVPDKIKX0DER
http://www.amazon.com/gp/product/B0054YZDWC/ref=ox_sc_act_title_4?ie=UTF8&psc=1&smid=ATVPDKIKX0DER
http://www.amazon.com/gp/product/0486280616/ref=ox_sc_act_title_5?ie=UTF8&psc=1&smid=ATVPDKIKX0DER
http://www.amazon.com/gp/product/B000BO6RWK/ref=ox_sc_act_title_6?ie=UTF8&psc=1&smid=ATVPDKIKX0DER
http://www.amazon.com/gp/product/B000IVGBFY/ref=ox_sc_act_title_7?ie=UTF8&psc=1&smid=ATVPDKIKX0DER
http://www.amazon.com/gp/product/B000E7AIA6/ref=ox_sc_act_title_8?ie=UTF8&psc=1&smid=ATVPDKIKX0DER
http://www.amazon.com/gp/product/B000SHQ73Y/ref=ox_sc_act_title_9?ie=UTF8&psc=1&smid=ATVPDKIKX0DER
http://www.amazon.com/gp/product/B0096XTJKO/ref=ox_sc_act_title_10?ie=UTF8&psc=1&smid=A3G2RBEZBLAJ53
http://www.amazon.com/gp/product/B000MVYJ0O/ref=ox_sc_act_title_1?ie=UTF8&psc=1&smid=ATVPDKIKX0DER
http://www.amazon.com/gp/product/B0006L0VMA/ref=ox_sc_act_title_2?ie=UTF8&psc=1&smid=A2701FG002WN6Z
http://www.amazon.com/gp/product/B000N0SNHY/ref=ox_sc_act_title_3?ie=UTF8&psc=1&smid=ATVPDKIKX0DER
http://www.amazon.com/gp/product/B008PNN8C6/ref=ox_sc_act_title_4?ie=UTF8&psc=1&smid=ATVPDKIKX0DER
http://www.amazon.com/gp/product/0800788036/ref=ox_sc_act_title_5?ie=UTF8&psc=1&smid=ATVPDKIKX0DER
http://www.amazon.com/gp/product/B003YOON0C/ref=ox_sc_act_title_6?ie=UTF8&psc=1&smid=ATVPDKIKX0DER
http://www.amazon.com/gp/product/B000Q3KHCM/ref=ox_sc_act_title_7?ie=UTF8&psc=1&smid=ATVPDKIKX0DER


The Amazon Picking Challenge team at Berkeley has provided a wealth of 2D and 3D data on the contest items by running them through the scanning system described in their ICRA 2014 work. Detailed models and raw data can be downloaded at:

http://rll.berkeley.edu/amazon_picking_challenge/

Bin Stocking

For each attempt the bin location of the items will be changed. There will be bins that contain a single item, and some that contain multiple items. Each bin will have only a single item that is a pick target. The bin contents will be fully described by the Contest Interface (below). The orientation of items within the bin will be randomized (e.g. a book may be lying flat, standing vertically, or have its spine facing inward or outward). Items will not occlude one another when viewed from a vantage point directly in front of the bin. Contestants can assume the following rough breakdown of bin contents:

• Single-item bins: At least two bins will only contain one item (single-item bins). Both these items will be picking targets.

• Double-item bins: At least two bins will contain only two items (double-item bins). One item from each of these bins will be a picking target.

• Multi-item bins: At least two bins will contain three or more items (multi-item bins). One item from each of these bins will be a picking target.

Duplicate items may be encountered. In the event that the work order names an item for a bin that contains two copies of that item, either copy can be picked (but not both).


Workcell Layout

The shelf will be placed in a fixed, known, stationary position prior to the start of the attempt, and the team will be allowed to perform any calibration and alignment steps prior to the start of their timed attempt (within the reasonable bounds of the contest setup time). The precise alignment of the shelf with respect to the robot area will be subject to slight deviations for natural reasons (it will not be intentionally misaligned by the contest organizers). The team is not allowed to move or permanently modify the shelf. If a team translates the base of the shelf by more than 12 cm the attempt will be stopped. An overhead view of the workcell layout can be seen in the diagram below.

The team is allowed to place their robot anywhere within the predefined robot workcell, but must have a starting gap of at least 10 cm from the shelf. The robot should be kept within the 2 meter x 2 meter square robot workcell for practical set-up and tear-down purposes at the competition, but exceptions will be considered for larger systems. Each robot must have an emergency-stop button that halts the motion of the robot. 110 Volt US standard power will be provided in the robot workcell.

Items need to be moved from the shelf into a container in the robot workcell referred to as the order bin. Teams may put the order bin anywhere they prefer. The order bin is a plastic tote available from the company U-Line (http://www.uline.com/Product/Detail/S-19473R/Totes-Plastic-Storage-Boxes/Heavy-Duty-Stack-and-Nest-Containers-Red-24-x-15-x-8?model=S-19473R&RootChecked=yes). Tables and other structures to support the bin can be brought with the team equipment; several table options will also be provided on-site by the organizers. The robot must not be supporting the order bin at the beginning or end of the challenge period, but it is allowed to pick up and move the bin. If a team ends its run while still supporting the bin, the team will receive only half the points for the items in the bin as its final score. If the bin is dropped, that counts as dropping all of the items in the bin. If a team has a question as to what might constitute 'supporting' the bin, or other related questions, contact the mailing list or committee and we can clarify for your situation. Accidentally moving the order bin within the robot workcell will not be penalized. Moving the order bin outside the workcell will result in no points being awarded for the objects in the bin.

Contest Interface

The Contest Interface will consist of a single JSON file handed out to each team prior to the start of their attempt that defines two things (a hypothetical example is sketched after the list):

• The item contents of all bins on the shelf face
• The work order detailing what item should be picked from each bin
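
The exact schema of the JSON file is not reproduced in this document, so the following is only a minimal sketch of how such a file might be organized and read. The key names (bin_contents, work_order), bin labels, and item identifiers are illustrative assumptions, not the official format.

```python
import json

# Hypothetical contest-interface file. The key names, bin labels, and item
# identifiers below are assumptions made for illustration only.
EXAMPLE_INTERFACE = """
{
  "bin_contents": {
    "bin_A": ["box_of_straws"],
    "bin_B": ["spark_plug", "dog_toy"]
  },
  "work_order": [
    {"bin": "bin_A", "item": "box_of_straws"},
    {"bin": "bin_B", "item": "spark_plug"}
  ]
}
"""

data = json.loads(EXAMPLE_INTERFACE)

# Walk the work order: for each target item, list the other items in its bin.
for order in data["work_order"]:
    bin_items = data["bin_contents"][order["bin"]]
    others = [item for item in bin_items if item != order["item"]]
    print(f"Pick {order['item']} from {order['bin']} (other items: {others})")
```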

Rules

The first day of the contest will be used for practice attempts. Teams will be scored during practice, but the outcome will not count toward prize evaluations. A schedule for scored attempts will be distributed prior to the contest.

An attempt is defined as a single scored run. The attempt starts when the team is given the Contest Interface file, and the attempt ends when one of the following conditions is met: the 20 minute time limit expires, the leader of the team verbally declares the run is complete, or a human intervenes (either remotely or physically) with the robot or shelf.


One hour prior to the start of an attempt the team will be given access to set up their system at the contest shelf. Ten minutes prior to the start of the attempt the organizers will obscure the contestants' view of the shelf and randomize the items on the shelf. Then, two minutes prior to the start of the attempt, the organizers will give the team a JSON file containing the shelf layout and work order as described in the Contest Interface. The team will then be allowed to upload the data to their system, after which the shelf will be unveiled and the attempt started. Items that appear in the work order can be picked in any order.

No human interaction (remote or physical) is allowed with the robot after uploading the work order and starting the robot. Note that this precludes any teleoperation or semi-autonomous user input to the robot. Human intervention will end the attempt, and the score will be recorded as the score prior to the intervention. Each team will have a maximum time limit of 20 minutes to complete the work order. The team is free to declare the attempt over at any time prior to the end of the 20 minute period. The number of attempts each team is allowed will be based on the number of participants (and decided prior to the contest). In the event teams are allowed multiple attempts the highest score from each team will be used to determine the contest outcome.

Robots that are designed to intentionally damage items or their packaging (such as piercing or crushing) will be disqualified from the contest. Questionable designs should be cleared with the contest committee prior to the competition.

Scoring

Points will be awarded for each target item removed from the shelf, and points will be subtracted for all penalties listed below. The points awarded vary based on the difficulty of the pick.

Moving a target item from a multi-item shelf bin into the order bin                  +20 points
Moving a target item from a double-item shelf bin into the order bin                 +15 points
Moving a target item from a single-item shelf bin into the order bin                 +10 points
Target Object Bonus                                                                  +(0 to 3) points
Moving a non-target item out of a shelf bin (and not replacing it in the same bin)   -12 points
Damaging any item or packaging                                                       -5 points
Dropping a target item from a height above 0.3 meters                                -3 points

Moving a target item from a multi-item shelf bin into the order bin: A target object picked from a shelf bin that contains 3 or more total items. The item must be moved into the order bin.

Moving a target item from a double-item shelf bin into the order bin: A target object picked from a shelf bin that contains 2 total items. The item must be moved into the order bin.

Moving a target item from a single-item shelf bin into the order bin: A target object picked from a bin that contains only that item. The item must be moved into the order bin.

Target Object Bonus: An added point bonus uniquely specified for each different object. The bonus points are added to the team's score for each target object that is successfully put in the order bin.

Moving a non-target item out of a bin (and not replacing it): A penalty is assessed for each item removed from (or knocked out of) the shelf that is not a target item. If the item is placed back on the shelf in the same bin (by the robot) there is no penalty.

Damaging any item or packaging: A penalty is assessed for each item that is damaged (both target and non-target items). Damage will be assessed by the judges. An item can be considered damaged if a customer would not be likely to accept the item if it were delivered in that condition.

Dropping a target item from a height above 0.3 meters: A penalty is assessed for an item dropped from too great a height. Note that this applies only to target items.

For example: A team successfully picks two items out of single-item bins (+10 +10), and one item out of a multi-item bin (+20). The first object was worth +0 bonus points, the second worth +3, and the third worth +1. However, in the process of picking they knock another item out of its bin (-12) and it falls to the floor. Also, when they go to place their first item down in the robot workspace it slips from the hand 0.5 meters from the ground (-3). This would result in the team having a total score of 29 points.
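
For concreteness, the worked example above can be written out as a short calculation. The sketch below is illustrative only; the point values are taken from the scoring table, while the particular picks, bonuses, and penalties mirror the hypothetical run described above.

```python
# Minimal sketch of the APC 2015 scoring arithmetic, reproducing the worked
# example above. Point values come from the scoring table; the picks and
# penalty events themselves are the hypothetical ones in the example.

BASE_POINTS = {"single": 10, "double": 15, "multi": 20}

def score_attempt(picks, non_target_removals=0, high_drops=0, damaged_items=0):
    """picks: list of (bin_type, bonus) for each target item placed in the order bin."""
    total = sum(BASE_POINTS[bin_type] + bonus for bin_type, bonus in picks)
    total -= 12 * non_target_removals  # non-target item removed and not replaced
    total -= 3 * high_drops            # target item dropped from above 0.3 meters
    total -= 5 * damaged_items         # damaged item or packaging
    return total

# Two single-item picks (+10+0, +10+3), one multi-item pick (+20+1),
# one knocked-out non-target item (-12), one high drop (-3).
print(score_attempt([("single", 0), ("single", 3), ("multi", 1)],
                    non_target_removals=1, high_drops=1))  # -> 29
```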

In the event of a point tie the team that ended their run first (the team with the shortest time) will be declared the winner. In the event that two teams are tied on points and time they will split the prize. The omission of time as a critical challenge metric is intentional. In this first year of the picking challenge we want to encourage participants to focus on picking as many items as possible, rather than picking a select few items as fast as possible.

Prizes

To motivate contestants to focus on the full end-to-end task, several prizes will be awarded to those with the best performance (as judged by the contest rules). To be eligible for the prize awards, teams must register their participation and sign the contest agreement by the contest enrollment closure date. The top three teams will receive monetary prizes as listed below:

            Prize (USD)   Minimum Score for Full Prize
1st Place   $20,000.00*   35 points
2nd Place   $5,000.00*    25 points
3rd Place   $1,000.00*    15 points

* All monetary prizes are conditional on eligible teams meeting the minimum score criteria for that prize, as outlined below. In the event of a scoring and time draw, the tied teams will split the prize for their position. Prizes will not be awarded to teams with scores less than or equal to 0 points.

Minimum Score Criteria

To obtain the full 1st - 3rd place prizes a team must meet a minimum score criterion. If a team places in the prize pool but does not meet the minimum criterion for that prize, they will be awarded half the amount of the prize for that place. For example, if a team places 1st in the competition but does not meet the minimum 1st place score criterion, they will be awarded a $10,000 prize.