
Official Report
FALL IMDL 2015 – EEL4665

Designer: Jesus Pintado
Robot Name: Ballsy

Instructors & TAs:
Dr. A. Antonio Arroyo
Dr. Eric M. Schwartz
Andrew Gray
Jacob Easterling


Table of Contents

Executive Summary
Walkthrough
Vision
Objective
Operation
Hierarchy
Vision Processing
Communication
Mobile Platform
Actuation
Sensors
Conclusion
Appendix


Executive Summary

We live in a world where the intelligence of robots and their ability to perform tasks better than humans is becoming apparent. Even if they cannot perform a task as quickly as a human, they are certainly a lot cheaper in the long run and can work 24 hours a day, 7 days a week if equipped with the proper tools. I have created a robot designed to collect either tennis balls or golf balls for real-life applications. To achieve this task, Ballsy had to be equipped with a sense of sight, the ability to move, and a means to grab objects and carry them.

Sight

For anything to understand its environment accurately it must have some kind of vision. Ballsy is equipped with a webcam and uses computer vision to extract information from its environment and act accordingly. Computer vision is made possible by the OpenCV library. Every image received is filtered for a color and shape of interest. Additionally, Ballsy is equipped with ultrasonic sensors to sense when it is getting too close to objects and avoid a collision. The webcam provides a sense of direction, while the ultrasonic sensors allow Ballsy to navigate its environment safely without destruction or harm.

Travel

To get any work done energy is required, and part of that energy is used in travel. Ballsy is equipped with two DC motors attached to wheels opposite one another. Adjacent to these motors is a pair of caster wheels for balance. Using the feedback from the webcam, the microprocessor controlling the motors decides which direction Ballsy will travel. Additionally, the DC motors have encoders which I use to make sure Ballsy does not get stuck. Ultrasonic sensors can detect objects, but they are prone to blind spots. Encoders cannot detect objects, but they tell me how fast each wheel is rotating. If the wheels are rotating too slowly, Ballsy is likely stuck and changes his direction of movement.

Grasp

The claw on Ballsy uses two servo motors. One rotates the arm, much like you would rotate your wrist, and the other opens and closes the claw. As you can guess, this is why the target must be lined up with the center of the robot: as soon as the claw rotates, it could knock the target out of place if not lined up properly. Once the target is collected it is dropped into a mobile container.

Ballsy has two controllers, an Arduino Mega 2560 (slave) and a Raspberry Pi 2 Model B (master). The master makes all the high-level decisions and processes the images received from the camera. The slave controls the motors, reads sensor data, and receives instructions from its master. This document will go into detail on the idea behind Ballsy, the parts used, the algorithms implemented, and the basic overall structure of the system.


Walkthrough

This report will cover everything from the point where I first thought about creating Ballsy to the point where I finished him. This includes everything from a basic list of parts to the more complex algorithms implemented in code.

Vision

Every week, golf course and tennis court owners write a check to their employees. Part of that check is payment for simply picking up balls and storing them. What if there was a robot that could do that, so owners never had to write that check again? Ballsy is the answer. Not only will this cut costs on employees, but also on the equipment used to collect these balls. The best part is that Ballsy is rechargeable, does exactly what he is told, gives no back talk, and pays off handsomely in the long run.

Objective

Ballsy is an autonomous robot that uses computer vision and ultrasonic sensors to approach balls and collect them.

Operation

Stage 1: Seek

• In the seek stage Ballsy roams around its environment looking for circular objects with unique characteristics. Ballsy is capable of identifying an object of a specific size, shape, and color.

• Once the desired object is identified, Ballsy begins to approach it; otherwise he keeps searching until he is convinced that his job is done.

Stage 2: Extraction

• During the approach, Ballsy continuously calculates the target's position relative to his center.

• Eventually, a strategically placed ultrasonic sensor detects the ball in front of it and triggers the extraction.

• The extraction is executed by a "claw-like" arm that rotates towards the target, closes the claw, and returns to its resting position, where it drops the ball into a mobile container.

During operation Ballsy is in either stage 1 or stage 2. If he has gone an x amount of time without detecting a desired object, he shuts down to conserve power.
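As a rough illustration of this two-stage cycle, the sketch below shows one way the seek/extract loop could be organized on the master. The helper functions (find_ball, approach_step, ball_in_claw_range, run_extraction) are hypothetical placeholders used only for illustration; the actual vision and serial code is listed in the appendix.

def operation_cycle(find_ball, approach_step, ball_in_claw_range, run_extraction):
    # Hypothetical sketch of the Stage 1 / Stage 2 cycle; helper names are placeholders
    while True:
        target = find_ball()          # Stage 1: seek a circular object of the right size, shape, and color
        if target is None:
            continue                  # keep searching (the shutdown timer is covered under Behavior)
        approach_step(target)         # Stage 2: drive toward the target, keeping it centered
        if ball_in_claw_range():      # the middle ultrasonic sensor sees the ball directly ahead
            run_extraction()          # rotate the arm, close the claw, and drop the ball in the container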


Hierarchy

Ballsy is equipped with two controllers, a Raspberry Pi 2 Model B and an Arduino Mega 2560. The Raspberry Pi is the master of the robot: every decision the robot makes originates in a process running on the master. The Arduino Mega (the slave) controls all the sensors and actuators based on the information it receives from its master. The following block diagram depicts the authority of the system.

Figure 1. Hierarchy diagram. The webcam feeds the Raspberry Pi 2 (master), which commands the Arduino Mega (slave); the Arduino Mega drives the DC motors and servo motors and handles the ultrasonic sensors and LCD display.


Vision Processing

For the vision processing I used the Raspberry Pi 2 Model B. Before I could even begin the actual vision processing I had to learn Linux, Python, and finally OpenCV, because the Raspberry Pi 2 runs a Linux operating system and comes pre-equipped with Python. Since the Raspberry Pi is the device doing the vision processing, the PS3 Eye webcam is connected to it via USB. If you would like to see the code for my computer vision algorithm, refer to the coding appendix.

The vision processing uses a feedback-based algorithm: it relies on data from the previous frame to decide which way to go. The examples that follow show how integral this feedback is to the robot completing its task. Below is a diagram of how the vision processing algorithm was implemented; note that xold is initialized to 0 at the start of the program.

Figure 2. Vision processing flowchart. The pipeline is: obtain image → convert to HSV → filter colors → erode → dilate → bitwise AND → Gaussian blur → Hough circles. If a circle is detected, the centroid is calculated and x is set to its x coordinate; if 120 ≤ x ≤ 180, xold is set to x and the robot moves forward, otherwise it turns left when xold > x and turns right when it is not.


Forward condition:

Figure 3. Forward example. The image origin is at (0,0) and the two columns sit at x = 120 and x = 180; both x and xold lie between them.

My loop contains conditional statements, one of which determines whether to go forward. The two columns shown in Figure 3 determine whether or not the direction must be changed: as long as x and xold are both between those columns, the robot continues to move forward.

Left turn condition:

Figure 4. Left turn example. In the 320×240 frame the target's centroid moves from xold = 170 to x = 140, and then from xold = 140 to x = 90.


The left turn example illustrated in Figure 4 shows how the robot decides whether or not to turn left. So long as xold is greater than x, the robot keeps turning left until x and xold both fall between the two columns, which leads back to the forward condition described above.

Right turn condition:

Figure 5. Right turn example. In the 320×240 frame the target's centroid moves from xold = 170 to x = 230.

The condition for Ballsy to turn right is that x is greater than xold. To see how this was implemented, refer to the source code for the Raspberry Pi 2 Model B in the appendix.

In order for this algorithm to work, the webcam must be placed strategically so that the two columns in the image fall within a range the claw arm can reach. Once Ballsy is close enough, the ultrasonic sensor in the center begins detecting the ball, and once that sensor returns a value within a certain range the claw extends for the pickup.
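Putting the three conditions together, the steering decision reduces to roughly the following condensed sketch. The thresholds 125 and 195 are the column positions used in the appendix code at the 320×240 resolution; the full program also tracks a count variable so that a command is only sent when the condition changes.

def steering_command(x, xold):
    # x is the centroid x coordinate in the current frame, xold the one from the previous frame
    if 125 <= x <= 195:   # target sits between the two columns
        return 'a'        # move forward
    elif xold > x:        # target drifted toward the left edge of the image
        return 's'        # turn left
    else:                 # target drifted toward the right edge of the image
        return 'd'        # turn right

The returned character is what gets written to the serial link described in the next section.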

Communication

In order for the slave to know what to do, the master has to be able to communicate with it. The connection is established using serial communication over USB: hardware-wise, I simply connected a USB cable from the Arduino to one of the available USB ports on the Raspberry Pi 2 Model B. I used the PySerial library for Python to set up serial communication on the Pi. Depending on which condition is present at the time, the Pi sends a single character (for example, 's' for the left-turn condition) and the slave responds accordingly.
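Below is a minimal sketch of the master side of this link, mirroring the port name, baud rate, and command characters used in the appendix code.

import serial
import time

# open the USB serial link to the Arduino Mega (same port and baud rate as the appendix code)
ser = serial.Serial(port="/dev/ttyACM0", baudrate=115200, bytesize=8, timeout=1)
time.sleep(2)    # the Arduino resets when the port is opened, so give it time to boot
ser.write('r')   # tell the slave that the vision program is ready to run

ser.write('s')   # example: the left-turn condition sends 's' and the slave turns left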



Mobile Platform

If this product were ever to be commercialized I would have designed it very differently. However, since this is just a prototype to prove an idea, I tried to make it as simple and compact as possible for ease of transportation and storage. This section of the document describes the design and the parts used.

Actuation

Ballsy uses two 6V 75:1 gearmotors to navigate.

Motor Specs:

• Size: 25D x 66L mm
• Weight: 104 g
• Shaft diameter: 4 mm
• Gear ratio: 74.83:1
• Free-run speed @ 6V: 82 rpm
• Free-run current @ 6V: 150 mA
• Stall current @ 6V: 2400 mA
• Stall torque @ 6V: 85 oz·in
• Lead length: 8 in
• Motor type: 2.4A stall @ 6V (LP 6V)

Figure 6. DC Motors

These motors were chosen so that Ballsy can complete his task relatively quickly.

To control the motors, a dual motor driver for Arduino is used.

Motor Driver Specs:

• Size: 2.0″ × 0.56″
• Weight: 2.3 g
• Motor driver: DRV8835
• Motor channels: 2
• Minimum operating voltage: 1.5 V
• Maximum operating voltage: 11 V
• Continuous output current per channel: 1.2 A
• Peak output current per channel: 1.5 A
• Continuous paralleled output current: 2.4 A
• Maximum PWM frequency: 250 kHz
• Reverse voltage protection: Yes

Figure 7. Dual Motor Driver


Ballsy uses a 6V Hitec servo motor to rotate the claw.

Hitec Servo HS-645MG Specs:

• Motor Type: 3 Pole
• Bearing Type: Dual Ball Bearing
• Speed (4.8V/6.0V): 0.24 / 0.20 sec/60°
• Torque oz./in. (4.8V/6.0V): 107 / 133
• Torque kg./cm. (4.8V/6.0V): 7.7 / 9.6
• Size in Inches: 1.59 x 0.77 x 1.48
• Weight ounces: 1.94
• Weight grams: 55

Figure 8. Hitec Servo

To open and close the claw, a Vex shaft motor was mounted on it.

Vex 3-Wire Servo:

• Free Speed: 100 rpm @ 7.5 volts
• Stall Torque: 6.5 in-lbs
• Voltage: 4.4 – 9.1 Volts
• PWM Input: 1 ms – 2 ms
• Dead Band: 1.47 ms – 1.55 ms

Figure 9. Vex Motor Side View

Figure 10. Vex Motor Top View


Sensors

Ballsy uses ultrasonic range finders to avoid obstacles.

Ultrasonic Sensor Specs:

• Power Supply: +5V DC
• Quiescent Current: <2 mA
• Working Current: 15 mA
• Effectual Angle: <15°
• Ranging Distance: 2 cm – 400 cm
• Resolution: 0.3 cm
• Measuring Angle: 30 degrees
• Trigger Input Pulse Width: 10 µs
• Dimension: 45 mm x 20 mm x 15 mm

Figure 11. Ultrasonic Sensor

To account for the blind spots of the sensors the DC motors are equipped with encoders.

48 CPR Encoder:

• Voltage Range: 3.5 – 20 V
• 2 encoder outputs (90 degrees out of phase)
• 48 counts per revolution

The encoders output a pulse width that grows as the wheel slows down. If the value rises above a set threshold, the wheels are rotating too slowly and Ballsy is likely stuck.

Figure 12. Encoder

Behavior

One of the features that was important to me while creating Ballsy was his ability to make smart decisions. After the boot-up process Ballsy begins to rotate counterclockwise until a ball is found. If a ball is not found after an x amount of time, Ballsy begins navigating around the room to bring a target within his range of sight. Even then, if he does not find a target within a y amount of time while roaming, he is convinced his job is done and shuts down to conserve power. Additionally, the lining up of the ball is at times not 100% accurate, so the claw does not miss its target outright but can knock it out of place. This is why, after every collection attempt, Ballsy takes a moment to scan his proximity to ensure that the target was actually collected. To see how these features were implemented in code, refer to the appendix.
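The timing layer described above is sketched below. The helper functions and the timeout values are hypothetical placeholders meant only to illustrate the logic; they are not part of the appendix code.

import time

SPIN_TIMEOUT = 30.0   # "x": seconds spent rotating in place before roaming (placeholder value)
ROAM_TIMEOUT = 120.0  # "y": seconds spent roaming before giving up (placeholder value)

def behavior_loop(ball_visible, rotate_counterclockwise, roam_step, attempt_pickup, target_still_nearby):
    last_progress = time.time()
    while True:
        waited = time.time() - last_progress
        if ball_visible():
            attempt_pickup()
            if target_still_nearby():    # the claw may have knocked the ball out of place
                continue                 # scan the proximity and try the pickup again
            last_progress = time.time()  # ball collected, reset the search timer
        elif waited < SPIN_TIMEOUT:
            rotate_counterclockwise()    # spin in place looking for a target
        elif waited < ROAM_TIMEOUT:
            roam_step()                  # navigate around the room to bring a target into view
        else:
            break                        # convinced the job is done, shut down to conserve power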


Conclusion

Ultimately I designed a robot that is able to scan its environment for targets and approach them effectively for collection. However, the range Ballsy has is limited by the resolution required to process the computer vision. Since the resolution of the image had to be downsized, my computer vision program is only able to detect an object from up to a 5-foot radius given optimal lighting and conditions. While this is not an issue for proving an idea, it would certainly be an issue in a real-life application.

One of the features I was happy with was the claw. Many arm-like mechanisms on robots are slow and/or inaccurate. Ballsy can approach its target relatively fast and scoop it up in a matter of seconds. This was something I was not expecting based on the research I did; the arm mechanisms I came across were not very fast.

Future Work

If I were to restart my project, the first thing I would do is learn SolidWorks. One of the obstacles I faced while creating Ballsy was crafting a platform; I had no idea how hard it was to cut a straight line before I started building him. I would therefore learn SolidWorks to design a more complex, cleanly cut frame and take advantage of the machinery available to me in the lab.


Appendix

Raspberry Pi 2 Vision Processing Code:

import numpy as np
import cv2
import os
import serial
import time

# setup serial communication with the Arduino
ser = serial.Serial(port="/dev/ttyACM0", baudrate=115200, bytesize=8, timeout=1)

count = 100  # count variable used for initial if condition later

cap = cv2.VideoCapture(0)  # declare object for video camera
cap.set(3, 320)
cap.set(4, 240)
# downsize resolution from 640x480 to 320x240

lower = np.array([7, 160, 139], np.uint8)
upper = np.array([17, 255, 255], np.uint8)
# HSV range for color red

time.sleep(2)   # Arduino restarts when serial communication is established, 2 sec to account for that
ser.write('r')  # sending 'r' to the Arduino indicating that this program is ready to run
time.sleep(1)   # wait 1 second before the loop is entered to ensure the Arduino is ready

while cv2.waitKey(1) != 27 and cap.isOpened():  # while the webcam connection is established
    read, frame = cap.read()  # read gets True or False on whether a frame was successfully read

    # convert to HSV to filter out unwanted colors
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, lower, upper)

    # erode to remove noise
    mask = cv2.erode(mask, None, iterations=1)

    # dilate to fill in gaps in the desired color
    mask = cv2.dilate(mask, None, iterations=2)

    # bitwise AND to create a binary image
    mask = cv2.bitwise_and(mask, mask)

    # smooth the object
    mask = cv2.GaussianBlur(mask, (5, 5), 0)

    # extract circles from the masked image
    circles = cv2.HoughCircles(mask, cv2.HOUGH_GRADIENT, 2, 200,
                               param1=40, param2=20, minRadius=5, maxRadius=0)

    if circles is not None:  # enter if a circle was detected
        # drawing the circle on the image
        for i in circles[0, :]:
            cv2.circle(frame, (int(round(i[0])), int(round(i[1]))), int(round(i[2])), (0, 255, 0), 2)
            center = cv2.circle(frame, (int(round(i[0])), int(round(i[1]))), 2, (0, 0, 255), 1)

        # calculating the centroid of the circle
        M = cv2.moments(mask)
        x = int(M['m10'] / M['m00'])
        y = int(M['m01'] / M['m00'])

        if count == 100:
            xold = x

        print 'new loop'
        print 'xold = ', xold
        print 'xnew = ', x
        print 'count = ', count

        if x <= 195 and x >= 125:  # if the detected circle is within a certain range in the image
            if xold >= 195 or xold <= 125:  # to avoid sending out consecutive serial data
                ser.write('a')
                count = 1
                print 'count set, a sent'
                # for debugging: cv2.waitKey(0)
            elif count == 100:  # this condition is just for the very first time an image is picked up
                ser.write('a')
                count = 1
                print 'count set, a sent'
                # cv2.waitKey(0)
        elif count == 1:
            count = 0
            if xold > x:
                print 'count zeroed, s sent'
                ser.write('s')
            else:
                ser.write('d')
                print 'count zeroed, d sent'
            # cv2.waitKey(0)

        xold = x

    # cv2.imshow('mask', mask)
    # cv2.imshow('frame', frame)

cap.release()
cv2.destroyAllWindows()


Arduino Mega Actuator and Sensor Code:

#include <LiquidCrystal.h>
#include <Servo.h>

LiquidCrystal lcd(34,35,36,38,39,40,41,42,43,44,45); // for LCD screen

// m2 left motor, m1 right motor
#define m1_enable 9     // pwm pin to determine speed of M1
#define m1_phase 7      // phase pin to determine direction of M1 (LOW = forward)
#define m2_enable 10    // pwm pin to determine speed of M2
#define m2_phase 8      // phase pin to determine direction of M2 (HIGH = forward)
#define m1_encoder_A 24 // encoder A output pin for m1, yellow wire on right motor
#define m1_encoder_B 25 // encoder B output pin for m1, white wire on right motor
#define m2_encoder_A 22 // encoder A output pin for m2, yellow wire on left motor
#define m2_encoder_B 23 // encoder B output pin for m2, white wire on left motor
#define trigPin1 27     // trig pin for sonar 1 (right most sonar)
#define echoPin1 26     // echo pin for sonar 1
#define trigPin2 29     // trig pin for sonar 2 (middle sonar)
#define echoPin2 28     // echo pin for sonar 2
#define trigPin3 31     // trig pin for sonar 3 (left most sonar)
#define echoPin3 30     // echo pin for sonar 3

Servo wrist;
Servo vex;

int incomingByte = 0;
int middleS;
float count = 0;
int state = 0; // 0 = searching, 1 = pursuing

void setup() {
  Serial.begin(115200);

  lcd.begin(16,2);
  //delay(1000);
  lcd.clear();
  lcd.print("preparing");
  //delay(500);

  // setting pins for the motor driver as outputs
  pinMode(m1_enable,OUTPUT);
  pinMode(m1_phase,OUTPUT);
  pinMode(m2_enable,OUTPUT);
  pinMode(m2_phase,OUTPUT);

  pinMode(m1_encoder_A,INPUT);
  pinMode(m1_encoder_B,INPUT);
  pinMode(m2_encoder_A,INPUT);
  pinMode(m2_encoder_B,INPUT);

  pinMode(trigPin1,OUTPUT);
  pinMode(trigPin2,OUTPUT);
  pinMode(trigPin3,OUTPUT);
  pinMode(echoPin1,INPUT);
  pinMode(echoPin2,INPUT);
  pinMode(echoPin3,INPUT);

  wrist.attach(3);
  vex.attach(4);
  retract();
  Close();

  while(incomingByte != 'r') // waiting for the ready signal from the RPi
  {
    if(Serial.available() > 0){
      incomingByte = Serial.read();
      if(incomingByte == 'r'){
        lcd.clear();
        lcd.print("entered loop");
      }
    }
  }
}

int readSonar1(){
  digitalWrite(trigPin1,HIGH); // sending a 10us pulse to trigger sonar 1
  delayMicroseconds(10);
  digitalWrite(trigPin1,LOW);
  int val1 = pulseIn(echoPin1,HIGH); // storing the echoed pulse
  val1 = val1/58;                    // converts the echo time to cm
  return val1;
}

int readSonar2(){
  digitalWrite(trigPin2,HIGH); // sending a 10us pulse to trigger sonar 2
  delayMicroseconds(10);
  digitalWrite(trigPin2,LOW);
  int val1 = pulseIn(echoPin2,HIGH); // storing the echoed pulse
  val1 = val1/58;                    // converts the echo time to cm
  return val1;
}

int readSonar3(){
  digitalWrite(trigPin3,HIGH); // sending a 10us pulse to trigger sonar 3
  delayMicroseconds(10);
  digitalWrite(trigPin3,LOW);
  int val1 = pulseIn(echoPin3,HIGH); // storing the echoed pulse
  val1 = val1/58;                    // converts the echo time to cm
  return val1;
}

void left(int d1,int d2){
  digitalWrite(m1_phase,LOW);  // right motor forward
  digitalWrite(m2_phase,LOW);  // left motor reverse
  analogWrite(m1_enable,d1);
  analogWrite(m2_enable,d2);
}

void right(int d1,int d2){
  digitalWrite(m1_phase,HIGH); // right motor reverse
  digitalWrite(m2_phase,HIGH); // left motor forward
  analogWrite(m1_enable,d1);
  analogWrite(m2_enable,d2);
}

void forward(){
  digitalWrite(m1_phase,LOW);  // right motor forward
  digitalWrite(m2_phase,HIGH); // left motor forward
  analogWrite(m1_enable,64);
  analogWrite(m2_enable,64);
}

void reverse(){
  digitalWrite(m1_phase,HIGH); // right motor reverse
  digitalWrite(m2_phase,LOW);  // left motor reverse
  analogWrite(m1_enable,128);
  analogWrite(m2_enable,128);
}

void motor_off(){
  analogWrite(m1_enable,0);
  analogWrite(m2_enable,0);
}

int m1_encoder(){
  int e1 = pulseIn(m1_encoder_B,HIGH); // pulse width grows as the wheel slows down
  return e1;
}

int m2_encoder(){
  int e2 = pulseIn(m2_encoder_B,HIGH); // pulse width grows as the wheel slows down
  return e2;
}

void retract(){
  wrist.writeMicroseconds(2500);
}

void extend(){
  wrist.writeMicroseconds(750);
}

void Open(){
  vex.writeMicroseconds(700);
}

void Close(){
  vex.writeMicroseconds(2200);
}

void loop(){
  if(Serial.available() > 0) // if a command was received from the Raspberry Pi
  {
    state = 1;
    incomingByte = Serial.read();
    if(incomingByte == 'a'){      // if the Pi sends an 'a', move forward
      forward();
      lcd.setCursor(0,1);
      lcd.print("F received");
    }
    else if(incomingByte == 's'){ // if the Pi sends an 's', turn left
      left(50,0);
      lcd.setCursor(0,1);
      lcd.print("L received");
    }
    else if(incomingByte == 'd'){ // if the Pi sends a 'd', turn right
      right(0,50);
      lcd.setCursor(0,1);
      lcd.print("R received");
    }
  }

  count++;
  if(count == 10000) // check the middle sensor every time count reaches 10000
  {
    count = 0;
    middleS = readSonar2();
    lcd.setCursor(0,0); // set the LCD cursor to the top left
    lcd.print("middleS = ");
    lcd.print(middleS);
    lcd.print("...");

    if(middleS <= 16 && middleS > 12){ // if the sonar reading is between these values, move the arm
      state = 0;    // ball is being lifted
      Serial.end(); // stop serial communication momentarily
      motor_off();
      Open();
      delay(1000);  // delay to ensure the arm has finished that motion
      extend();
      delay(1000);
      Close();
      delay(1000);
      retract();
      delay(1000);
      Open();
      delay(1000);
      Close();
    }
    Serial.begin(115200); // re-enable serial communication
  }

  if(state == 0) // search mode
  {
    left(75,0);
  }

  // int rightS = readSonar1();
  // int middleS = readSonar2();
  // int leftS = readSonar3();
  // int e1 = m1_encoder(); // reading m1 encoder value
  // int e2 = m2_encoder(); // reading m2 encoder value
}