
Page 1:

Robotics, AI and Machine Vision

Jesus Savage

Bio-Robotics Laboratory
biorobotics.fi-p.unam.mx
School of Engineering
National Autonomous University of Mexico (UNAM)

15th International Conference on Electromechanics and Robotics "Zavalishin's Readings" - 2020

Page 2:

OUTLINE

• Introduction
• Traditional Robotics Architectures
• Reactive Robotics Architectures
• Probabilistic Robotics Architectures
• Hybrid Robotics Architectures
• A System to Operate Mobile Robots (ViRBot)
• Video

Page 3:

A Robot in Every Home: Overview/The Robotic Future. Bill Gates, Scientific American (2008)

Service Robots in an Intelligent House

Introduction

Page 4:

Traditional Robotics Architectures

Features:

There is a representation of the environment, with a symbolic representation of the objects in each room.

The objects are represented by polygons whose vertices (Xi, Yi) are ordered clockwise. These polygons separate the occupied space from the free space where the robot can navigate.

Page 5:

Traditional Robotics Architectures

Features: Movements and actions are planned using traditional artificial intelligence search techniques over topological networks.

[Figure: a global path between rooms and local paths within each room]

Page 6:

The basic search problem:

Given:

1. Starting point (node)
2. The goal point (node)
3. A map of nodes and connections

Goals:

1. Find some path or find the "best" path (maybe the shortest)
2. Traverse the path

[Figure: example search graph with start node S, goal node G, intermediate nodes A-F, and weighted edges]

Traditional Robotics Architectures

Page 7:

Traditional Robotics Architectures

Search: Artificial Intelligence techniques to find:

1. Some path: Depth-first, Hill climbing, Breadth-first, Beam, Best-first
2. Optimal path: British museum, Branch and bound, A*, Dijkstra
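To make the optimal-path group concrete, here is a minimal Dijkstra sketch in Python. The graph is a hypothetical map of nodes and connections with start S and goal G; it is not the graph drawn on the slide.

import heapq

def dijkstra(graph, start, goal):
    # graph: dict mapping node -> list of (neighbor, cost) pairs
    queue = [(0, start, [start])]   # (cost so far, node, path)
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path       # first time the goal pops, the path is optimal
        if node in visited:
            continue
        visited.add(node)
        for neighbor, step in graph[node]:
            if neighbor not in visited:
                heapq.heappush(queue, (cost + step, neighbor, path + [neighbor]))
    return None                     # no path exists

# Hypothetical map; undirected edges are listed in both directions.
graph = {
    "S": [("A", 4), ("D", 3)],
    "A": [("S", 4), ("B", 2)],
    "B": [("A", 2), ("G", 5)],
    "D": [("S", 3), ("E", 4)],
    "E": [("D", 4), ("G", 3)],
    "G": [("B", 5), ("E", 3)],
}
print(dijkstra(graph, "S", "G"))    # -> (10, ['S', 'D', 'E', 'G'])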

Page 8:

Traditional Robotics Architectures

Features:

It has a serial organization: if one module fails, the entire system fails.

This type of system is not suitable for dynamic environments or for robots that have errors in movement and sensing.

Page 9:

Traditional Robotics Architectures

The robot Shakey was the first example of this architecture, developed from approximately 1966 through 1972 at the Stanford Research Institute (SRI).

Page 10:

Reactive Robotics Architectures

Features:

Based on the behavior of insects.
A representation of the environment is not necessary.
It does not use action or movement planning.
It is suitable for dynamic environments and for robots with sensing errors.
It is based on behaviors running in parallel.

Page 11:

Reactive Robotics Architectures

Behaviors are represented using stimulus-response (SR) diagrams.

The output of each behavior must be produced instantaneously from the moment there is an input. The behaviors are independent of each other. Behaviors can be designed using zero-order logic, state machines, potential fields, neural networks, etc.

Page 12:

Reactive Robotics Architectures

Zero-Order Logic

Features:

The values of the input sensors are checked and, if they satisfy a certain property, an output is generated that lasts a certain time. These behaviors have no memory.

Example: If the light source is above the robot and there is an obstacle in front and on the right side, then the robot turns to the left.
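A minimal sketch of the example above as a memoryless zero-order-logic behavior in Python; the sensor flags and the command strings are hypothetical stand-ins for the robot's actual interface.

def turn_behavior(light_above, obstacle_front, obstacle_right):
    # Pure stimulus-response: the output depends only on the current inputs,
    # so the behavior keeps no memory between calls.
    if light_above and obstacle_front and obstacle_right:
        return "turn_left"
    return "go_forward"

print(turn_behavior(True, True, True))    # -> turn_left
print(turn_behavior(True, False, False))  # -> go_forward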

Page 13:

Reactive Robotics Architectures

Attractive and repulsive potential fields

These behaviors have no memory.
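A minimal sketch of such a memoryless potential-field behavior: the goal attracts the robot, nearby obstacles repel it, and the two forces are simply added. The gains k_att and k_rep and the influence distance d0 are hypothetical.

import math

def potential_field_step(robot, goal, obstacles, k_att=1.0, k_rep=0.5, d0=1.0):
    """Return the resultant (x, y) force on the robot: attraction to the
    goal plus repulsion from every obstacle closer than d0."""
    fx = k_att * (goal[0] - robot[0])
    fy = k_att * (goal[1] - robot[1])
    for ox, oy in obstacles:
        dx, dy = robot[0] - ox, robot[1] - oy
        d = math.hypot(dx, dy)
        if 0 < d < d0:
            # Repulsion grows sharply as the robot approaches the obstacle.
            mag = k_rep * (1.0 / d - 1.0 / d0) / d ** 2
            fx += mag * dx / d
            fy += mag * dy / d
    return fx, fy

print(potential_field_step((0, 0), (5, 0), [(0.5, 0.1)]))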

Page 14:

Reactive Robotics Architectures

Using Artificial Neural Networks

Reactive behaviors without memory using neural networks

Page 15:

Reactive Robotics Architectures

Using state machine algorithms

These behaviors have memory.
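A minimal sketch of a behavior with memory written as a state machine; the wall-following states and transition conditions are hypothetical, not taken from the slide.

def wall_following_step(state, wall_on_right, obstacle_front):
    """One transition of a simple wall-following state machine.
    The current state is the behavior's memory."""
    if state == "SEARCH_WALL":
        return ("FOLLOW_WALL", "go_forward") if wall_on_right else ("SEARCH_WALL", "turn_right")
    if state == "FOLLOW_WALL":
        return ("AVOID", "turn_left") if obstacle_front else ("FOLLOW_WALL", "go_forward")
    if state == "AVOID":
        return ("SEARCH_WALL", "go_forward")
    raise ValueError(state)

state = "SEARCH_WALL"
for wall, front in [(False, False), (True, False), (True, True), (True, False)]:
    state, command = wall_following_step(state, wall, front)
    print(state, command)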

Page 16:

Reactive Robotics Architectures

Using Recurrent Artificial Neural Networks

These behaviors have memory.

Page 17:

The SR diagrams can be combined into different structures by connecting them in parallel, either adding the outputs of each of them or selecting one of the outputs using an arbiter, as sketched below.

Reactive Robotics Architectures
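A minimal sketch of both combination schemes, with hypothetical behavior names, commands and priorities: summing the outputs of the parallel behaviors, or letting a priority arbiter select one of them.

def sum_outputs(outputs):
    # Parallel combination: add the (vx, vy) command of every behavior.
    return tuple(sum(c) for c in zip(*outputs))

def arbiter(outputs, priorities):
    # Selection: among the behaviors that produced a command,
    # the one with the highest priority wins.
    active = [(priorities[name], cmd) for name, cmd in outputs.items() if cmd is not None]
    return max(active)[1]

avoid = (0.0, 1.0)       # steer away from an obstacle
go_to_goal = (1.0, 0.0)  # head toward the goal
print(sum_outputs([avoid, go_to_goal]))  # -> (1.0, 1.0)
print(arbiter({"avoid": avoid, "goal": go_to_goal},
              {"avoid": 2, "goal": 1}))  # -> (0.0, 1.0), avoidance wins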

Page 18:

Genghis, a robust hexapodal walker

Genghis, a robust hexapodal walker, is an example of this architecture; it was developed by Rodney Brooks at MIT in the 1980s.

Reactive Robotics Architectures

Page 19:

Probabilistic Robotics Architectures

Features:

It is based on the concept that both the sensing of the environment that the robot performs and its movements depend on random variables, which can be handled using probabilistic methods.

Page 20:

Probabilistic Robotics Architectures

Stanley, an autonomous car created by Stanford University in 2005 under the direction of Sebastian Thrun, is an example of this architecture.

Hidden Markov Models (HMM), Particle Filters, Markov Decision Processes, etc. are used.
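A minimal 1-D particle-filter sketch of this idea: both the robot's motion and its position measurement are treated as noisy random variables, and the belief is a set of weighted samples. All noise levels and numbers are hypothetical.

import math
import random

def particle_filter_step(particles, control, measurement,
                         motion_noise=0.1, sensor_noise=0.2):
    # Predict: move every particle by the control, corrupted by motion noise.
    particles = [p + control + random.gauss(0, motion_noise) for p in particles]
    # Weight: how well does each particle explain the measurement?
    weights = [math.exp(-(measurement - p) ** 2 / (2 * sensor_noise ** 2))
               for p in particles]
    # Resample particles in proportion to their weights.
    return random.choices(particles, weights=weights, k=len(particles))

particles = [random.uniform(0, 10) for _ in range(500)]  # unknown start position
for true_position in (1.0, 2.0, 3.0):                    # robot moves 1 m per step
    particles = particle_filter_step(particles, 1.0, true_position)
print(sum(particles) / len(particles))                   # estimate near 3.0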

Page 21:

Hybrid Robotics Architectures

Traditional, reactive and probabilistic architectures are combined to compensate for the deficiencies of each of them.

Page 22:

Hybrid model used with the Justina robot: Virtual Real Robot (ViRBot)

Page 23:

INPUT LAYER

This layer processes the data from the robot's internal and external sensors, which provide information about the internal state of the robot as well as the external world where the robot interacts.

Some of our robot designs have lasers, sonars, infrared sensors, microphones, and stereo and RGB-D cameras.

Page 24:

INPUT LAYER

Digital signal processing techniques are applied to the data provided by the internal and external sensors to obtain a symbolic representation of the data, as well as to recognize and process voice and visual data.

Pattern recognition techniques are used to create models of the objects, places and persons that interact with the robot.

Page 25:

INPUT LAYER

Vision System

For the recognition of objects, people and places using RGB-D cameras, vision systems that are robust to partial occlusions and to scale and rotation changes are used.

Page 26:

Visual Description

[Diagram: visual description, descriptor storage in k-d trees, and detection]
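A minimal sketch of the storage and detection steps using a k-d tree (SciPy's cKDTree): stored descriptors are indexed offline, and each new descriptor is matched to its nearest stored neighbor. The random vectors here are hypothetical stand-ins for real SIFT/ORB-style descriptors.

import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
stored = rng.normal(size=(1000, 64))       # descriptors of known objects
labels = rng.integers(0, 10, size=1000)    # which object each descriptor came from

tree = cKDTree(stored)                     # build the k-d tree once, offline

# Detection: a noisy view of the descriptor of object labels[42].
query = stored[42] + rng.normal(scale=0.01, size=64)
dist, idx = tree.query(query, k=1)         # nearest stored descriptor
print(labels[idx], labels[42], round(dist, 3))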

Page 27:

INPUT LAYER

By Aphex34 - Own work, CC BY-SA 4.0, https://commons.wikimedia.org/w/index.php?curid=45679374

Convolutional neural network

Page 28:

Training New Objects in a Deep Neural Network Using YOLO (Edgar Silva)

Page 29:

INPUT LAYER

The Use of Hidden Markov Models (HMM) for the Recognition of Human Activities

Finding persons and their actions in the environment using deep neural networks (Edgar Vazquez)

Page 30:

INPUT LAYER

Scene Classification and Understanding (Hugo Leon)

N. Silberman et al., "Indoor segmentation and support inference from RGBD images," Computer Vision – ECCV 2012, pp. 746–760, 2012.

Page 31:

INPUT LAYER

SCHEDULED TASKS

A set of tasks that the robot needs to perform at the time they were scheduled.

HUMAN / ROBOT INTERFACE

Communication between the user and the robot can be through voice and hand gestures. The robot responds using synthetic voice and simple facial expressions.

Page 32:

INPUT LAYER - HUMAN / ROBOT INTERFACE: Speech Recognition System

[Diagram: Speech → Digital Signal Processing → Speech Recognition System → Natural Language Understanding → Perception]

Page 33:

INPUT LAYER - HUMAN / ROBOT INTERFACE

Natural Language Understanding

* "Robot, give the newspaper to Dad"* "Robot, give the newspaper to Dad"

* "Please bring me the newspaper that is there"* "Please bring me the newspaper that is there"

    (Dad is giving the order to the robot)(Dad is giving the order to the robot)

Both phrases are represented using the following Both phrases are represented using the following Conceptual Dependency primitive:Conceptual Dependency primitive:

(ATRANS (ACTOR Robot) (OBJECT newspaper) (ATRANS (ACTOR Robot) (OBJECT newspaper)

(TO Dad) (FROM newspaper-place))(TO Dad) (FROM newspaper-place))

With this representation a planner makes an action plan to With this representation a planner makes an action plan to achieve what the robot is being asked for.achieve what the robot is being asked for.
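A minimal sketch of how the ATRANS primitive could be carried as a structure for the planner to consume. The dataclass fields mirror the slide; plan() and its action names are hypothetical.

from dataclasses import dataclass

@dataclass
class ATRANS:        # Conceptual Dependency: transfer of possession
    actor: str
    obj: str
    to: str
    from_: str

def plan(cd):
    # A planner would expand the primitive into navigation,
    # grasping and delivery operations.
    return [f"navigate({cd.from_})", f"grasp({cd.obj})",
            f"navigate({cd.to})", f"deliver({cd.obj}, {cd.to})"]

cd = ATRANS(actor="Robot", obj="newspaper", to="Dad", from_="newspaper-place")
print(plan(cd))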

Page 34:

INPUT LAYER

Digital signal processing techniques are applied to the data provided by the internal and external sensors to obtain a symbolic representation of the data.

With the symbolic representation, this module generates a series of beliefs that represent the state of the environment where the robot interacts.

Page 35:

PERCEPTION

Example: In the following figure, the symbolic representation generates the beliefs "there is a hole in front of the robot" or "there is a shadow in front of the robot".

Page 36:

The beliefs generated by the perception module are validated by this layer using the Knowledge Management layer; in this way a situation recognition is created.

PLANNING LAYER

Page 37:

[Figure: topological map, symbolic map, and raw map]

This layer has different types of maps for the representation of the environment; they are created using SLAM techniques.

Cartographer

KNOWLEDGE MANAGEMENT LAYER

Page 38:

KNOWLEDGE MANAGEMENT LAYER

Cartographer

[Diagram: RGB-D camera → CNN → estimation of the robot's pose using the Kalman filter]

The Kalman filter and deep neural networks are used to estimate the robot's position and orientation (Diego Cordero)
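A minimal scalar Kalman-filter sketch of the predict/correct cycle behind such a pose estimator. The noise variances and the measurement sequence are hypothetical; in the system above, the measurements would come from the CNN processing the RGB-D images rather than from a direct position sensor.

def kalman_step(x, p, u, z, q=0.01, r=0.1):
    """One predict/correct cycle for a scalar state.
    x, p: state estimate and its variance
    u: odometry motion, z: position measurement
    q, r: motion and measurement noise variances."""
    x, p = x + u, p + q                  # predict with the motion model
    k = p / (p + r)                      # Kalman gain
    x, p = x + k * (z - x), (1 - k) * p  # correct with the measurement
    return x, p

x, p = 0.0, 1.0
for u, z in [(1.0, 1.1), (1.0, 1.9), (1.0, 3.05)]:
    x, p = kalman_step(x, p, u, z)
    print(round(x, 3), round(p, 4))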

Page 39:

KNOWLEDGE MANAGEMENT LAYER

WORLD MODEL

There is a representation of the objects, people and places where the robot interacts, and of the relationships that exist between them. There are five structures that represent objects, rooms, furniture, humans and the robot.

(Human (name Mother) (room studio) (zone couch) (objects book)
       (pose 1.8 2.0 0.5) (locations main-bedroom))

(Furniture (name kitchen-table) (room kitchen) (zone left-side)
           (pose 0.2 2.3 1.0) (use eating) (attributes fixed))

(Object (name newspaper) (room outside) (zone outside-door)
        (pose 0.5 1.3 0.1) (possession nobody) (use information)
        (locations living kitchen) (attributes movable))

All this information is updated by the input layer and by the actions of the robot.
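A minimal sketch of the same facts as Python structures; the update_object helper is hypothetical, standing in for the updates made by the input layer and by the robot's actions.

world_model = {
    "humans": [{"name": "Mother", "room": "studio", "zone": "couch",
                "objects": ["book"], "pose": (1.8, 2.0, 0.5),
                "locations": ["main-bedroom"]}],
    "furniture": [{"name": "kitchen-table", "room": "kitchen",
                   "zone": "left-side", "pose": (0.2, 2.3, 1.0),
                   "use": "eating", "attributes": ["fixed"]}],
    "objects": [{"name": "newspaper", "room": "outside", "zone": "outside-door",
                 "pose": (0.5, 1.3, 0.1), "possession": "nobody",
                 "use": "information", "locations": ["living", "kitchen"],
                 "attributes": ["movable"]}],
}

def update_object(model, name, **fields):
    # Refresh a fact, e.g. after the robot picks up or delivers an object.
    for obj in model["objects"]:
        if obj["name"] == name:
            obj.update(fields)

update_object(world_model, "newspaper", room="living", possession="Dad")
print(world_model["objects"][0]["room"])  # -> living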

Page 40:

KNOWLEDGE MANAGEMENT LAYER

KNOWLEDGE BASE

Robot knowledge is represented using a rule-based system, CLIPS (NASA). It is encoded as production rules, which correspond to the actions that the agent would take if certain conditions are met.

For the previous R2 example, the following rule would be activated:

Shadow Rule { If there are trees around the robot's path and it is a clear day, then there will be a shadow in the path. }
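In the real system this rule would be a CLIPS defrule; here is a hedged Python rendering of the same condition-action pair over a set of belief facts, with hypothetical fact names.

def shadow_rule(facts):
    # If there are trees around the robot's path and it is a clear day,
    # assert the belief that there will be a shadow in the path.
    if "trees-around-path" in facts and "clear-day" in facts:
        facts.add("shadow-in-path")

facts = {"trees-around-path", "clear-day"}
shadow_rule(facts)
print("shadow-in-path" in facts)  # -> True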

Page 41:

PLANNING LAYER

Situation Validator

In the previous example, the cartographer does not report a hole in the area in front of the robot, and the knowledge representation reports a shadow.

Then, with these facts, the new situation is that there is no danger and that the robot must continue toward its destination.

Page 42:

PLANNING LAYER

Activation of Goals

Given a situation recognition, a set of goals is activated to solve it.

Action Planner

Action planning finds a sequence of physical operations to achieve the activated goals.

Page 43:

PLANNING LAYER

Motion Planner

Local path: path inside each room.
Global path: path between rooms.

Page 44:

EXECUTION LAYER

This layer executes the action and movement plans and checks that they are carried out accordingly.

A set of hard-wired procedures, represented by state machines, is used to partially solve specific problems: finding persons, object manipulation, etc.

The executor uses this bank of procedures to execute a plan.

Page 45:

EXECUTION LAYER

Behavior Methods

Behavior methods are used to avoid obstacles not contemplated by the motion planner. The behavior methods are state machines, potential fields and neural networks.

Page 46:

EXECUTION LAYER

CONTROL ALGORITHMS

Control algorithms, like PID, are used to control the operation of the virtual and real actuators.

[Diagram: PID control loop with reference Vi(t), an error summing junction (+/-), controller h[t], and output y(t)]
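A minimal discrete PID sketch matching the loop in the diagram; the gains, the time step, and the toy first-order plant standing in for the actuator are all hypothetical.

def make_pid(kp, ki, kd, dt):
    state = {"integral": 0.0, "prev_error": None}
    def step(setpoint, measurement):
        error = setpoint - measurement
        state["integral"] += error * dt
        derivative = 0.0 if state["prev_error"] is None \
            else (error - state["prev_error"]) / dt
        state["prev_error"] = error
        return kp * error + ki * state["integral"] + kd * derivative
    return step

pid = make_pid(kp=1.2, ki=0.3, kd=0.05, dt=0.1)
y = 0.0                  # toy first-order actuator
for _ in range(50):
    u = pid(1.0, y)      # drive the output toward the setpoint 1.0
    y += 0.1 * (u - y)   # toy plant dynamics
print(round(y, 3))       # settles near 1.0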

Page 47:

KNOWLEDGE MANAGEMENT LAYER

Learning

Genetic algorithms and programming
Probabilistic methods: Markov chains, Bayesian classifiers
Clustering (K-means, Vector Quantization)
Artificial Neural Networks
Reinforcement Learning

Page 48:

KNOWLEDGE MANAGEMENT LAYER

Learning

Genetic algorithms and programming

The goal is to use an optimization algorithm, such as genetic algorithms (GA), to find the best robot behaviors for avoiding obstacles while trying to reach a destination.
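A minimal sketch of that GA loop under stated assumptions: a behavior is a small parameter vector, and the fitness function below is a stand-in for a simulator score combining obstacle avoidance and progress toward the destination.

import random

def evolve(fitness, n_params=4, pop_size=20, generations=50):
    pop = [[random.uniform(-1, 1) for _ in range(n_params)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]            # selection: keep the fittest half
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_params)  # one-point crossover
            child = a[:cut] + b[cut:]
            i = random.randrange(n_params)       # mutation
            child[i] += random.gauss(0, 0.1)
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

# Stand-in fitness: prefer parameters close to a hypothetical optimum.
target = [0.5, -0.2, 0.8, 0.0]
best = evolve(lambda p: -sum((x - t) ** 2 for x, t in zip(p, target)))
print([round(x, 2) for x in best])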

Page 49:

Robot Learning Using Simulators

The House Of inteRactions (THOR), Allen Institute for Artificial Intelligence (Adrian Sarmiento)

Gazebo, a 3D robot simulator integrated with a physics engine (Oscar Fuentes)

With simulated images the robot is trained to navigate in new environments while it is in resting mode.

Page 50:

Robot Learning Using Simulators

The THOR simulator is used to do MDP reinforcement learning for the robot's movement (Adrian Sarmiento).
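A minimal tabular Q-learning sketch of MDP reinforcement learning on a toy corridor; the states, actions, rewards and the step() environment are hypothetical stand-ins for the simulator.

import random

def q_learning(states, actions, step, episodes=500,
               alpha=0.5, gamma=0.9, eps=0.1):
    q = {(s, a): 0.0 for s in states for a in actions}
    for _ in range(episodes):
        s = states[0]
        for _ in range(50):
            # Epsilon-greedy action selection.
            a = random.choice(actions) if random.random() < eps \
                else max(actions, key=lambda act: q[(s, act)])
            s2, reward, done = step(s, a)
            best_next = max(q[(s2, act)] for act in actions)
            q[(s, a)] += alpha * (reward + gamma * best_next - q[(s, a)])
            s = s2
            if done:
                break
    return q

states = list(range(5))      # corridor cells; the goal is cell 4
actions = [-1, +1]
def step(s, a):
    s2 = min(4, max(0, s + a))
    return s2, (1.0 if s2 == 4 else -0.01), s2 == 4

q = q_learning(states, actions, step)
print(max(actions, key=lambda act: q[(2, act)]))  # -> 1, move toward the goal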

Page 51:

Various obstacle layouts and illumination conditions for the robot to learn how to place objects.

Robot Learning Using Simulators

Unity (Angelica Nakayama)

Page 52:

Proposed method: PonNet (A. Nakayama)

• Pre-trained k layers of ResNet-50 as feature extractor

• Additional attention mechanism for balancing RGB and Depth

Page 53:

Image Synthesis with Generative Adversarial Nets (GAN)

The generator creates synthetic images of the environment.

[Figure panels: Training; Path]

With synthetic images the robot is trained to navigate in new environments while it is in resting mode. (Future Work)

Page 54:

Robots

Robot TX8 (1999)

Robot TPR8 (2006)

Robot PACK-ITO (2009)

Page 55:

Robots

Robot AL-ITA (2011)

Robot JUST-INA (2012 - ...)

Robot Takeshi, Toyota (2018 - ...)

Page 56:

Robot Justina

• Mechatronic head (pan and tilt)
  • Kinect
  • Stereo cameras
  • Microphone
• Torso
  • Kinect
  • Hokuyo laser
• Two arms (7 DoF)
• Mobile base

Page 57:

Blackboard

Page 58:

BLACKBOARD AND THE ROBOT OPERATING SYSTEM (ROS)

[Diagram: the Blackboard Module connects to ROS through a BLKROS node]

Page 59:

RoboCup@Home Participation

2007 Atlanta, USA, Third Place
2018 Montreal, Canada, Second Place
2019 Sydney, Australia, Second Place

Justina in China, 2015

Page 60:

Video

Page 61:

Video

Page 62:

Robotics, AI and Machine Vision

Jesus Savage

Bio-Robotics Laboratory
biorobotics.fi-p.unam.mx
School of Engineering
National Autonomous University of Mexico (UNAM)

15th International Conference on Electromechanics and Robotics "Zavalishin's Readings" - 2020