
Ultrasonic RAdar System (URAS): Arduino and Virtual Reality for a light-free mapping of indoor environments

A. Tedeschi*, Student Member, IEEE, S. Calcaterra, F. Benedetto, Senior Member, IEEE

Abstract— Recently, academia and industry have focused on the creation of new systems for mapping and exploration of unknown spaces in order to create advanced guide systems for robots and people affected by disabilities. In particular, the most common applications are related to the exploration of unknown and/or dangerous spaces that are not accessible to people by exploiting the advantages offered by ultrasonic technology. This work aims at designing a new low-cost system, namely Ultrasonic Radar System (URAS), to blindly map environments by using ultrasonic sensors, then displaying the acquired information through an Android-based device.

Index Terms— Arduino, Android, Environmental Mapping, Internet of Things, Ultrasonic Radar System, Virtual Reality.

I. INTRODUCTION

INTERNET of Things (IoT) usually refers to a set of technologies (e.g. sensors, tags, mobile devices, and communication technologies) used to design and create advanced and complex systems that aim at improving the quality of life [1]-[2]. Through the paradigms and the standardization of IoT [3], it is possible to develop ad-hoc hardware systems where physical components communicate and cooperate in order to reach a common goal [4]. Recently, new wearable devices, incorporating advanced electronic technologies, have been developed under the framework of IoT to provide practical functions and features that aid people in their everyday life. In particular, several solutions have been developed by academia to improve the quality of life of blind and visually impaired people. For instance, Rey et al. in [5] propose a wearable device based on a cheap ultrasonic sensor for the detection of obstacles and vibrotactile feedback to warn the end-user about the detected obstacles.

Several works design novel IoT solutions based on ultrasonic technology, serving several purposes such as the detection of obstacles, the tracking of human and robot behavior in an environment, and the mapping of environments. More recently, several applications based on ultrasonic technology have been developed to improve the mapping and localization of a robot module for the exploration of unknown and/or dangerous spaces that cannot be accessed by humans. As a matter of fact, ultrasonic sensors have become a popular measurement tool because of both their simplicity and affordability. A great deal of research effort has focused on developing systems that can be remotely controlled to explore an environment. For instance, Correa et al. in [6] design an autonomous navigation system based on two parts: a navigation system based on the Microsoft Kinect device [7], and an artificial neural network (ANN) deployed on a laptop integrated in the navigation system to recognize different configurations of the environment. Similarly, Bergeon et al. in [8] develop a 3D mapping system by creating a robot module based on the Kinect device and a 2D laser sensor to increase the accuracy of the system.

This paper aims at providing a new low-cost, versatile system based on ultrasonic technology and the Arduino Nano platform [10] to blindly map indoor environments with the aid of an Android-based mobile device. In particular, the proposed system, namely the Ultrasonic Radar System (URAS), is composed of four components: (i) the independent module, which is the core of the URAS and is based on an ultrasonic sensor connected to an Arduino Nano that forwards the acquired data to the mobile application; (ii) the robot module, which is a prototype of the physical support used to remotely map the indoor environment; (iii) the Android-based mobile application, which remotely controls the system and processes the acquired data through three different views (i.e. 2D, 3D, and virtual reality); (iv) the Google Cardboard, an inexpensive viewer used to explore the environment through the virtual reality (VR) view [11]-[15]. The novelties proposed by our (hardware and software) solution are: (i) the design and creation of a miniaturized, independent module that can be integrated in any kind of robot module (e.g. the one introduced here) or worn by the user thanks to its compactness and lightness; in particular, the size of the independent module is 6x9 cm and it weighs 100 grams, battery included; (ii) the design and development of an Android-based application that allows the user to remotely control the independent module integrated in a robot module, receiving direction inputs through two kinds of remote controllers provided in the application: a software remote controller composed of four buttons that represent the possible directions of the robot module, and an accelerometer-based controller exploiting the information provided by the embedded accelerometer sensor to fluently move the robot. These two controllers make it possible to easily integrate our independent module in other advanced, ready-to-use robot modules like [25]; (iii) the visualization of the data acquired during the mapping through the VR view and the Google Cardboard, allowing the end-user to explore the current indoor environment in an immersive way. Even though our solution is based on the typical combination of an ultrasonic sensor with the Arduino platform, we provide an important improvement in the hardware and software design of the system. As a matter of fact, our solution provides a compact, low-cost and versatile hardware device that allows


users to map and explore the environment remotely in real-time by using different kinds of visualizations (i.e. 2D, 3D, and VR) provided by the Android application. In addition, the modularity of our solution makes it easy to reuse the independent module and the mobile application with advanced, ready-to-use robot modules.

The remainder of this paper is organized as follows. In Section II, we provide a brief analysis of the main state-of-the-art solutions. Section III describes the basics of object detection with ultrasonic sensors. In Section IV, we discuss the rationale of our URAS, as well as the design of the related Android-based application. Lastly, we discuss the results of some testing campaigns for our system in Section V, while in Section VI our conclusions are briefly presented.

II. RELATED WORKS

Bharambe et al. in [16] design a device composed of two ultrasonic sensors connected to a microcontroller, which has to be grasped by the end-user. Through this device, the end-user can detect possible obstacles, then receiving feedback through vibrator motors placed on three fingertips. Gupta et al. in [17] propose an advanced guide cane that guides the user in both outdoor and indoor environments thanks to a global positioning system (GPS) and an obstacle detection system based on an ultrasonic sensor. Sunehra et al. in [18] design a mobile robot system for remote sensing and monitoring with the ability to avoid obstacles. The system is controlled remotely by an end-user via a base station composed of a laptop connected to the Internet in order to communicate with the robot module. The robot is composed of different sensors and devices, such as a smartphone camera and an ultrasonic sensor to detect possible obstacles. Chmelar et al. in [19] introduce a platform composed of an RGB camera and ultrasonic and laser sensors, combining the measurements provided by such sensors to map the whole environment. The final output is a 3D vector map of the analyzed environment, which has to be created via laptop. However, the proposed system is expensive due to the presence of the laser sensor and the RGB camera. Ismail et al. in [20] propose a mobile robot to create a map of an unknown environment while simultaneously locating the platform itself in the environment. To reach this goal, they design an algorithm that extracts features from raw sonar data received from the mobile robot. The robot is composed of a set of sonar sensors connected to an Arduino UNO platform and sends the acquired data to a laptop in order to process the data and recreate the environment. The growing popularity of mobile devices has changed the point of view of researchers, who have started exploiting this emerging technology to provide innovative solutions. As a matter of fact, to improve the mapping of real environments using compact systems, Lim et al. have developed a mobile robot that exploits a smartphone wirelessly connected to an embedded Arduino platform [21]. Exploiting the data received from the ultrasonic sensor, the smartphone computes the current location of the robot by applying the Sequential Monte Carlo approach [22]. Balaji et al. in [23] propose a mobile robot module remotely controlled via an Android-based smartphone through a Bluetooth connection. In order to control the mobile robot module, a basic mobile application is developed to enable the Bluetooth connection and to send the direction input obtained through the smartphone's accelerometer sensor. However, the system does not provide a visual output of the mapped object. Jing et al. in [24] introduce their car unit for search missions, remotely controllable via an Android-based device. The system, namely AndroidRC, is equipped with an ultrasonic sensor, an RGB camera, a Bluetooth receiver, a Wi-Fi transmitter, and two Arduino microcontroller boards. The AndroidRC can be controlled by the user through the device's accelerometer, sending the acquired direction to the car unit via Wi-Fi. However, the system does not allow the user to create a map of the environment.

III. ULTRASONIC SENSOR FOR OBJECT DETECTION

The most common ultrasonic sensors exploited in several state-of-the-art approaches are the HC-SR04 [26] and HC-SR05 [27] sensors, which can be used both indoors and outdoors, since ultrasonic technology is not sensitive to light variations or low-light environments.

This kind of sensor is composed of a transmitter and a receiver, enabling the computation of the distance to an object without loss of accuracy, except in the presence of objects made of sound-absorbing materials, which obviously make the measurements inaccurate. The measurements are performed through a sequence of ultrasonic pulses at a frequency of 40 kHz that can hit objects in the range of 2 cm to 400 cm [26]-[27]. If an object is in the sensor's range, the ultrasonic pulses are reflected by the surface of the object and their echoes are then retrieved by the receiver, as shown in Fig. 1.

Fig. 1. Example of the HC-SR05 during the detection of an obstacle in its working range.

Exploiting the information about the time interval between the sending of the pulse and the reception of its echo, it is possible to compute the distance from the obstacle through the connected microcontroller board, such as an Arduino [28]. To enable the communication between the ultrasonic sensor and the microcontroller board, it is necessary to connect the digital pins of the sensor to the board, i.e.: (i) the Trig pin is used to send to the HC-SR05 a 5 V pulse with a duration of at least 10 µs; (ii) the Echo pin is used to receive a signal that identifies the echoes retrieved by the receiver. Through these pins, the microcontroller manages the sensor, sends the ultrasonic pulses, and retrieves their echoes.

Fig. 2. Timing diagram of the ultrasonic sensor [29].


As shown in Fig. 2, a trigger pulse with a duration of 10 µs is sent from the Arduino platform to the Trig pin of the HC-SR05 sensor. Once the sensor has received the trigger pulse, it sends a sequence of eight ultrasonic pulses at 40 kHz. If there is an object reflecting the pulses, the sensor outputs on the Echo pin a 5 V pulse whose width is proportional to the detected distance. Otherwise, if no echo is retrieved within 38 ms, a null value is returned. Finally, the Arduino platform computes the distance in centimeters from the detected object using (1):

Distance (cm) = Time / 58    (1)

where Time is the width of the echo pulse in µs.

IV. PROPOSED URAS AND DATA VISUALIZATION

The URAS aims at providing a new low-cost, compact solution to blindly map light-free indoor environments and to display the acquired data to the end-user through different kinds of views (i.e. 2D, 3D, and VR). In particular, we have decided to use an ultrasonic sensor that exploits a 40 kHz sound wave because the advantages of this choice are threefold: (i) this design solution helps in reducing the possibility of false echoes, since it is very likely that a 40 kHz wave comes from the ultrasonic sensor itself rather than from any other source; (ii) since attenuation is directly related to the frequency of the wave, at 40 kHz the attenuation is approximately 0.5 dB/ft, while it increases rapidly with frequency, jumping to 2.7 dB/ft at 150 kHz; (iii) since we aim at providing a new low-cost, versatile ultrasonic system compatible with the Arduino Nano platform and Android-based mobile devices, we opted for a 40 kHz ultrasonic sensor, which is the most popular choice for prototyping. In fact, it is easy to use with an Arduino and compatible with the Raspberry Pi, its scanning range is good, and it provides stable measurements and high precision for an inexpensive ultrasonic sensor. In addition, the hardware and software choices are well supported by the features and advantages provided by the Arduino Nano and Android platforms. In particular, we select Arduino instead of other microcontroller platforms like the Parallax Basic Stamp [29] because of its features, such as the cross-platform Integrated Development Environment (IDE), namely the Arduino Software IDE [30], based on Processing [31], and its open-source, extensible software and hardware, which are continuously updated, ensuring a high level of component compatibility. In fact, Arduino allows users to create ad-hoc hardware modules, based on different sensors connected to the platform, and to manage their behavior through ad-hoc programs written in the Wiring programming language [32] and deployed on the platform. In addition, the integration of a Bluetooth component allows Arduino to forward the data to a third-party application that processes and/or shows the data to the end-user. We choose Android as the target mobile platform to develop the mobile application because of its worldwide diffusion and open-source nature [33]-[34]. Through the Android SDK, we defined all the methods and mechanisms to develop a mobile application that allows users to: (i) manage the communication between the smartphone and the independent module; (ii) display the data acquired through the module; and (iii) remotely control the position of the URAS. In the following sections, we present the main structural choices behind the design and development of the proposed system.

Fig. 3. The schema of the proposed URAS composed of a) an independent module and b) a robot module.

A. Design of the Ultrasonic Radar System

The hardware design of the proposed URAS is based on the Arduino platform in order to exploit its open-source nature, creating a system able to map surrounding environments by exploiting cheap components without any loss in accuracy. We design the hardware of our URAS by defining two modules: (i) an independent module, which is the core of our URAS and provides the hardware components to map an environment (see Fig. 3a); (ii) a robot module, which is a physical support for the independent module and composed of two servo-motors remotely controlled by an Android-based smartphone (see Fig. 3b). In particular, we propose here a simple prototype of a robot module to test the behavior of the independent module of the URAS and the mobile application.

Fig. 4. The circuit diagram of the independent module of the URAS.

As shown in Fig. 4, the independent module is composed of the following hardware components: (i) the Arduino Nano version 3.0 is the main component of the system, because it allows managing the independent module [30]. It can be connected to compatible modules through 14 digital input/output pins and 8 analog inputs, and it can be supplied by an external battery, allowing it to be integrated into a mobile unit. Finally, the compact size of the Arduino Nano (i.e. 40x15 mm) and the possibility to programmatically manage its behavior and that of the connected components make it fit for our aim; (ii) the HC-05 Bluetooth Module provides the communication between the remote controller (i.e. the Android-based application) and the independent module. It is based on the CSR Bluetooth chip BC417143, supporting Bluetooth protocol version 2.0 [35]. It operates in the 2.4 GHz ISM band and exploits encrypted Gaussian minimum shift keying (GMSK) modulation to communicate with an authenticated device. Finally, this module supports several baud rates for


communication and, for the URAS, we selected a baud rate of 57600; (iii) the HC-SR05 ultrasonic sensor is used to gather information about the distance from a detected object; (iv) the connection board efficiently connects each hardware component to the Arduino board, supplying the required power; (v) a toggle switch button is connected to the Arduino Nano platform to switch it on/off; (vi) a 9 V battery supplies power to the whole system; (vii) a plastic case contains all the above-mentioned components and is modified to provide integration with the robot module or a mobile device using a holder. These choices are well supported by the need to create a compact, light module that can be worn by the end-user or easily integrated into any kind of mobile or static robot module.

Here, in order to test the behavior of the independent module when it is integrated in a robot module, we developed a simple prototype of a static robot module. The behavior of the robot is kept extremely simple so that the independent module can be easily integrated with any other kind of robot module. As a matter of fact, through our approach the independent module becomes the core of the robot, providing the direction inputs received remotely from the mobile application. This complementary robot is composed of: (i) two Tower Pro SG90 micro-servos [36] that act as servomotors and manage the position of the robot module with horizontal and vertical rotation angles of at most 180 degrees; (ii) a plastic case containing a 7.2 V, 1050 mAh NiMH rechargeable battery, which replaces the 9 V battery of the independent module when it is connected to the robot and is necessary to power the independent module's features and the two servomotors; (iii) two Plexiglas pieces, 4 mm thick, used as the basic structure to host the two servomotors and the independent module; (iv) a plastic container and a tube used to hold the battery and the wires and to support the Plexiglas pieces. The horizontal servomotor is located on the back of the Plexiglas structure, while the vertical servomotor is located on the plastic pipe and connected to a helical spring to handle vertical positions and the stability of the whole robot. To connect the independent module to the robot unit, we exploited the connection pins provided by the robot unit. As shown in Fig. 5, the right pins are used to supply power to the independent module, while the left pins connect the independent module with the two micro-servos in order to supply power and to forward the information about the rotation angle to the micro-servos.

Fig. 6 shows the final version of the URAS composed of the independent module and the robot module.

Fig. 5. Physical interface configuration of the independent module, where s is the signal transmitted from the independent module to the micro-servos and the symbols "+" and "-" represent the supplied power.

Fig. 6. The final version of the URAS using a) the independent module only or b) the robot module and the independent module.

In Tab. 1, we report the price (in euros) of each hardware component and the final price of the proposed URAS, underlining the cheapness of our system, which can be easily replicated for less than 53 €. Moreover, it is possible to decrease the price of this solution by printing the plastic components with a 3D printer using our project files released on thingverse.com [37].

To enable the remote mapping of the surrounding environment, the independent module has to be connected to the robot module in order to update its position through the direction inputs sent from the Android-based application via Bluetooth. To manage this behavior, as well as the mapping, we develop an ad-hoc sketch program that allows the module to interact with the other hardware components. In detail, the sketch uses two libraries, SoftwareSerial and Servo, written in C/C++ and used to manage the Bluetooth module and the servomotors, respectively. The sketch is then deployed on the Arduino board.
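The sketch itself is not listed in the paper; a simplified skeleton consistent with the description above (SoftwareSerial towards the HC-05 at 57600 baud, Servo initialization at 90 degrees, and the average of three consecutive non-null readings described later in this section) might look as follows. The pin assignments and the readDistanceCm helper are assumptions.

#include <SoftwareSerial.h>
#include <Servo.h>

// Illustrative pin choices, not the actual URAS wiring.
const int TRIG_PIN = 9;
const int ECHO_PIN = 10;
SoftwareSerial btSerial(2, 3);     // RX, TX towards the HC-05 Bluetooth module
Servo horizontalServo;
Servo verticalServo;

// Single trigger/echo measurement, converted to cm via (1); 0 means no echo.
float readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH); delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  unsigned long echoTime = pulseIn(ECHO_PIN, HIGH, 38000UL);
  return echoTime / 58.0;
}

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  btSerial.begin(57600);           // baud rate selected for the HC-05
  horizontalServo.attach(5);
  verticalServo.attach(6);
  horizontalServo.write(90);       // standard initial position
  verticalServo.write(90);
}

void loop() {
  // Average three consecutive non-null measurements before transmitting.
  float samples[3];
  int collected = 0;
  while (collected < 3) {
    float d = readDistanceCm();
    if (d > 0) samples[collected++] = d;
  }
  float mean = (samples[0] + samples[1] + samples[2]) / 3.0;
  btSerial.println(mean);          // forward the averaged distance to the mobile app
  // Direction vectors received from the application would be parsed here.
}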

TABLE I
PRICE (IN EUROS) OF EACH ELEMENT USED TO CREATE THE URAS.

Component                          Price
Arduino Nano v3.0                  14.99 €
HC-05 Bluetooth Module              2.99 €
HC-SR05                             9.99 €
Micro-servos                        7.00 €
Batteries                          10.00 €
Other minor hardware components     5.00 €
Plastic containers and pipe         3.00 €
Total                              52.97 €

Fig. 7. Simplified flow chart of (a) the mapping process performed by the URAS and (b) the data acquisition process performed by the mobile application.

Fig. 7.a provides a simplified flow chart of the mapping process performed by the URAS. Once the URAS is activated, an initial setup of each element is required to start mapping the environment. In particular, it is necessary to make the URAS reachable by the mobile device through the Bluetooth communication channel and, if the independent module is connected to the robot, to set the position of the two servomotors to 90 degrees (i.e. the standard position). Once the setup is completed, the URAS is ready to map the surrounding environment through the HC-SR05 sensor, which forwards the acquired data to the Arduino platform in order to compute the distance from detected objects. In particular, to provide accurate distance information, we adopt a simple approach based on the mean of three consecutive, non-null measurements. At each acquisition loop performed by the independent module, the computed measurement is temporarily saved in an array of acquired measurements. Once the module has collected three consecutive non-null values, it computes their mean and sends the result


to the mobile device via Bluetooth, where the Android-based application processes it. If the independent module is connected to the robot, the algorithm then checks whether the remote control is enabled and assesses the presence of new direction vectors to update the current position of the robot. To do so, the independent module checks whether the received direction vectors are corrupted due to the communication channel conditions, avoiding unintentional movements or misbehavior of the robot module. In particular, the system computes the absolute value of the difference between the new and current angles of the micro-servo and compares the result with an empirical, pre-defined threshold, which is here fixed to 100 degrees. The threshold is used to avoid wrong movements of the robot module due to unusual rotation-angle values caused by the Bluetooth channel conditions, or sudden swerving movements that can affect the stability of the robot, a necessary feature. If the computed value is lower than the considered threshold, the position of the micro-servo is updated. In order to move the robot unit, we defined an ad-hoc method, namely moveServo, which is responsible for modifying the position of a micro-servo according to the received direction vector. In particular, this method takes two integer values as input parameters: the first represents the micro-servo that should be moved (i.e. the vertical or the horizontal micro-servo), while the second is the rotation angle, which should be in the range 0-180 degrees. If the rotation angle is not in this range, the micro-servo is not moved; otherwise, the write method of the Servo library is invoked to update the position of the micro-servo.
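A possible shape of the moveServo routine and of the 100-degree plausibility check is sketched below; the paper does not list the actual code, so everything except the name moveServo and the threshold value is an assumption.

#include <Servo.h>

Servo horizontalServo;
Servo verticalServo;
const int ANGLE_JUMP_THRESHOLD = 100;  // empirical threshold used in the paper

// Move the selected micro-servo (0 = horizontal, 1 = vertical) to 'angle',
// only if the requested angle is valid and the jump is plausible.
void moveServo(int servoId, int angle) {
  if (angle < 0 || angle > 180) {
    return;                            // out-of-range angles are ignored
  }
  Servo &s = (servoId == 0) ? horizontalServo : verticalServo;
  if (abs(angle - s.read()) >= ANGLE_JUMP_THRESHOLD) {
    return;                            // likely a corrupted direction vector
  }
  s.write(angle);                      // Servo library call that rotates the motor
}

void setup() {
  horizontalServo.attach(5);           // illustrative pins
  verticalServo.attach(6);
  horizontalServo.write(90);           // standard initial position
  verticalServo.write(90);
}

void loop() {
  // moveServo() would be invoked here with angles decoded from the
  // direction vectors received over Bluetooth (see the rest of Section IV.A).
}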

The position update of the URAS is performed through two modalities. In the first one, the end-user can exploit the accelerometer sensor embedded in the mobile device to update the position of the robot module, generating the following direction vector:

v = < modality ; pitch_angle ; azimuth_angle >    (2)

where the modality bit describes which approach is used by the end-user to update the position of the robot, while the other two values are the current pitch and azimuth angles provided by the accelerometer sensor. In this scenario, the modality bit is equal to 0. Once the mobile device transmits the data, the independent module validates the input to avoid corrupted data due to the transmission. Then, if the received direction vector is valid, the Arduino updates the position of the servomotors using the provided pitch and azimuth angles. The second modality is based on a software remote controller composed of four virtual buttons displayed in the mobile application, representing the possible directions of the two servomotors (i.e. up, down, left, right). This approach generates a different kind of direction vector composed of three bits:

v = < modality ; servomotor ; direction >    (3)

where the modality bit is set to 1, the servomotor bit indicates which servomotor should be updated, and the direction bit indicates the direction of the movement. For instance, according to Tab. 2, if the independent module receives the vector < 1; 1; 0 >, it moves the vertical servomotor down by 10 degrees. This step value is set via software and can be programmatically replaced, providing a high level of accuracy in the mapping of the environment. Finally, the mapping process is executed until the end-user stops it through the application.

TABLE II
CODING COMMANDS OF THE SOFTWARE REMOTE CONTROLLER.

Bit   Servomotor              Direction (vertical servomotor)   Direction (horizontal servomotor)
0     Horizontal servomotor   down                              left
1     Vertical servomotor     up                                right
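Decoding a software-remote-controller vector on the Arduino side, according to the coding of Table II and the 10-degree step mentioned above, could be sketched as follows. The sign convention (e.g. whether "down" decreases the angle) depends on how the servomotors are mounted and is assumed here, as are the pin numbers.

#include <Servo.h>

Servo servos[2];                       // 0 = horizontal, 1 = vertical micro-servo
int currentAngle[2] = {90, 90};        // standard initial positions
const int STEP_DEGREES = 10;           // per-command rotation step from the text

void setup() {
  servos[0].attach(5);                 // illustrative pin assignments
  servos[1].attach(6);
  servos[0].write(90);
  servos[1].write(90);
}

// Decode a software-remote-controller vector <1; servomotor; direction>
// according to Table II and rotate the selected micro-servo by 10 degrees.
void applyButtonCommand(int servomotorBit, int directionBit) {
  // direction bit: 0 = down/left, 1 = up/right (Table II); sign is an assumption
  int delta = (directionBit == 0) ? -STEP_DEGREES : STEP_DEGREES;
  int angle = constrain(currentAngle[servomotorBit] + delta, 0, 180);
  currentAngle[servomotorBit] = angle;
  servos[servomotorBit].write(angle);
}

void loop() {
  // Example: <1; 1; 0> moves the vertical servomotor down by 10 degrees.
  // In the real sketch the three values would be parsed from the Bluetooth stream.
}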

B. Software Design and Ultrasonic Data Visualization by Android and OpenGL

To remotely manage the behavior of the system and visualize the acquired data in real-time, we designed and developed an ad-hoc mobile application for Android-based devices.

The architecture of our application relies on the principles of the well-known Layers design pattern [38]-[39]. Following the guidelines of this pattern, we define four layers, each one managing a specific application behavior.

The four layers are: (i) the graphics model layer, which is used to model and manage the graphic elements that represent the distance information acquired through the URAS. This layer is composed of two sub-layers: the basic rendering sub-layer provides methods for rendering the elements in bi-dimensional (2D) or tri-dimensional (3D) visualizations, while the virtual reality rendering sub-layer defines the functions to manage the VR, defining a binocular stereo vision through the Application Programming Interfaces (APIs) of the Google Cardboard [40]. In addition, the definition of these sub-layers is based on the OpenGL ES library [41]; (ii) the I/O communication layer supplies the methods to manage the communication between the mobile device and the independent module through a communication channel based on the Bluetooth protocol; (iii) the business logic layer is the core of the application and allows the application to initialize the communication with


the URAS and to generate the direction vectors to update the position of the robot module. In addition, this layer manages the distance information obtained through the URAS and the creation of the objects to be drawn; (iv) the UI layer allows the end-user to remotely control the robot module and to display the data acquired by the URAS in real-time, choosing among the 2D, 3D, and VR views. This organization allows us to properly manage the tasks required by the mobile application, which are summarized in the flow chart of Fig. 7.b. Once the application is running, the end-user enables the communication between the device and the URAS through the definition of a Bluetooth communication channel, which is created exploiting the methods of the I/O communication layer. The Bluetooth communication is based on the Bluetooth API provided by the Android platform [42] and requires invoking an ad-hoc method that listens to the communication channel and intercepts the flow of incoming data. Through such a channel, the application is able to receive the acquired data and to remotely control the hardware units by sending direction commands, generating the direction vectors (see sub-section IV.A) for the independent module. Then, the end-user can select how to remotely control the system through the application (i.e. the device's accelerometer or the software remote controller) and how to visualize the data in real-time by choosing among the three views. Once the communication channel is established, the application is ready to receive and process the data acquired by the URAS. The Bluetooth packets received by the device contain the measured distances to the (possibly) detected objects. Such values are forwarded to the business logic layer, which prepares them to be rendered through the graphics model layer methods. To update the graphical visualization of the received data, we exploit the OpenGL ES 2.0 library, which provides a set of cross-platform APIs to process 3D graphics elements. Through this library, we define the methods and elements (i.e. points, lines, triangular and rectangular shapes, and spheres) exploited for the three different types of data visualization. In particular, we create an entity, namely Renderer, which dynamically updates the current view in order to reproduce a simplified version of the mapped environment. If the data are visualized through the 2D view, a radar map is shown to the end-user, composed of two main graphic elements: a set of 40 concentric circles, each representing a distance step of 10 cm from the detected objects, and triangular shapes representing the measured distances from detected objects. This view allows the end-user to map the environment by using the accelerometer sensor or the software remote controller.

In the 3D and VR views, the camera's point of view is set at the center of a sphere whose radius is equal to the maximum range of the ultrasonic sensor (i.e. 400 cm), as shown in Fig. 8. The 3D view is characterized by rectangular shapes, differing in color, which represent the detected objects and their current distance from the HC-SR05 sensor.

Through this data view, the end-user can dynamically change his/her point of view, exploring the mapped environment by using the accelerometer sensor or the software remote controller. If the VR view is selected, the end-user has to wear the Google Cardboard, a cheap cardboard viewer that allows the user to exploit binocular stereo vision [43] and to control the hardware units through the accelerometer.

Fig. 8. Camera point of view for the 3D and VR views.

Even though this view adopts the same point of view and objects designed for the 3D view, it requires a different definition of the virtual environment, which has to be based on the Google Cardboard APIs [42] in order to properly manage the data visualization and the changes in the camera point of view. Through this view, the end-user can remotely explore the surrounding environment in an immersive way. In addition, thanks to the lightness of the independent module, the end-user can wear it by attaching it to the back of the mobile device. In this way, the end-user has greater freedom of movement than with the proposed static robot module, and he/she can explore the whole virtual environment by changing the camera point of view via the accelerometer sensor. Finally, if the remote control is active and the independent module is connected to the robot module, the user can provide the direction inputs as follows.

When the end-user is using the accelerometer sensor, he/she creates a direction input through the current values of the azimuth and pitch angles, which are embedded in a direction vector like the one in (2) and put into the buffer of the Bluetooth connection. To provide this kind of remote control, we implement the SensorEventListener interface of the Android API and enable the application to acquire the data from the accelerometer [44]. In addition, these values and the roll angle are also forwarded to the Renderer, which updates the current visualization for the end-user. If the end-user has chosen the software remote controller, he/she can use the four direction buttons to encode a new position for the robot module and, hence, to map and display a new section of the environment thanks to the new point of view of the robot. In particular, this controller generates the direction vector defined in (3) and allows the independent module to update its position by 10 degrees for each received direction. It should be noted that this kind of controller limits the accuracy of the mapping and can be used only with the 2D and 3D views, since it requires direct input from the end-user, who has to press the direction buttons displayed on the screen. To improve the efficiency of the application, we adopt a multithreading programming approach that allows the end-user to provide direction inputs, placing them in the queue of the Bluetooth socket, while the application is receiving the acquired data via Bluetooth. The direction inputs are then sent to the independent module, which updates its position if the robot module is connected. Finally, the discussed operations are executed until the end-user stops the mapping and the visualization process and then closes the communication with the independent module.

V. EXPERIMENTAL RESULTS

In this section, we report an overview of the developed mobile application and the experimental results carried out to


evaluate the performance of the URAS located in a room, in the presence of several objects made of different materials and having different shapes and sizes.

Fig. 9. Mockup of the navigation page of the proposed URAS mobile application, showing the real pages of the application.

A. Application overview

Figs. 9-10 show an overview of the mobile application used to visualize the acquired data and to remotely control the URAS. The main page of the application, namely MainActivity, allows the end-user to perform several operations, such as setting up the Bluetooth connection and selecting a data visualization view. By clicking on the Bluetooth icon, the end-user is redirected to the DeviceListActivity page, where he/she can configure the connection between the device and the independent module from a list of Bluetooth devices. Once the independent module is selected, the application automatically sets the MAC address of the device and requires the end-user to insert the password. Otherwise, if the end-user already knows the MAC address of the independent module, he/she can set it manually through the SettingsActivity page, shortening the time required to pair the two devices. In the MainActivity page, the user can choose the remote control mode of the URAS by clicking on the corresponding buttons, switching between the controller based on the accelerometer sensor and the software remote controller. Finally, to visualize the data acquired by the independent module, the user can click on one of the three vertical buttons, which represent the 2D, 3D and VR views, respectively. Through these buttons, the end-user is redirected to a different application page to visualize the mapping process in real-time. When the end-user selects the 2D view, the application displays a radar map composed of 40 concentric circles and triangular shapes that represent the current distances from detected objects. In addition, each gray circle represents a distance of 10 cm, while the white ones represent a distance of 1 m from the detected objects (see Fig. 10.a and 10.b). Finally, in the upper left corner the distance from the last detected object is reported. If the 3D view is selected, the application displays a 3D environment composed of rectangular shapes, where the color of a rectangle represents the distance from the hardware system. In particular, if the object is near the module, the rectangle is white; otherwise, it becomes increasingly red as the distance from the detected object increases. The whole virtual environment is enclosed in a sphere, which allows the user to map an environment completely and to explore its virtual representation (see Fig. 10.c and 10.d). In the upper left corner the distance from the last detected object is reported. Finally, when the VR view is selected and the end-user is wearing the Google Cardboard (see Fig. 10.e), he/she can explore the virtual environment by exploiting the accelerometer sensor to change the camera point of view and, hence, the position of the independent module connected to the robot module. In Fig. 10.f, it is possible to see the virtual environment displayed to the end-user, composed of colored rectangular shapes that represent the detected objects at different distances from the module. At the top center, the distance from the last detected object is reported. It is important to underline that the VR view splits the device's screen into two parts displaying the same objects which, thanks to the Google Cardboard and the principle of binocular stereo vision, are perceived as a single virtual environment.
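The exact color law used by the Renderer is not reported in the paper; a simple linear mapping consistent with the description (white when the object is near the module, shading towards red as the distance approaches the 400 cm limit) can be sketched as follows.

// Illustrative mapping from measured distance to the rectangle color used in the
// 3D/VR views: white for near objects, fading towards pure red at the 400 cm limit.
// The linear law is an assumption, not the application's actual code.
struct Rgb { float r, g, b; };

Rgb distanceToColor(float distanceCm) {
  const float maxRangeCm = 400.0f;     // HC-SR05 maximum range
  float t = distanceCm / maxRangeCm;   // 0 = touching the sensor, 1 = at max range
  if (t < 0.0f) t = 0.0f;
  if (t > 1.0f) t = 1.0f;
  // Keep the red channel full and fade green/blue out, so the shape goes
  // from white (1,1,1) to pure red (1,0,0) as the distance increases.
  return Rgb{1.0f, 1.0f - t, 1.0f - t};
}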

B. Types of Graphics

We evaluated the performance of our URAS using several objects made of different materials, having different sizes, and located at different distances in the room. The tests were conducted in a controlled environment and aimed at evaluating the accuracy of the measurements performed through the URAS by computing the mean, the variance, and the standard deviation of the estimated distance. In the first battery of tests, we set a distance of 40 cm between the URAS and the considered objects, reporting the mean distance, the variance, and the standard deviation measured by the sensor over 100 trials, where each trial is composed of three consecutive measurements.
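For completeness, the statistics reported in Tables III-V can be reproduced from the raw trial data as sketched below; whether the biased or unbiased variance estimator was used is not stated in the paper, so the unbiased (sample) form is assumed here.

#include <cmath>
#include <vector>

// Mean, variance and standard deviation of a set of estimated distances,
// as reported in Tables III-V. Requires at least two samples.
struct Stats { double mean, variance, stddev; };

Stats computeStats(const std::vector<double> &distances) {
  double sum = 0.0;
  for (double d : distances) sum += d;
  double mean = sum / distances.size();

  double sq = 0.0;
  for (double d : distances) sq += (d - mean) * (d - mean);
  double variance = sq / (distances.size() - 1);   // unbiased sample variance (assumption)

  return Stats{mean, variance, std::sqrt(variance)};
}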

TABLE III
MEASUREMENT ACCURACY TESTS OF THE URAS IN PRESENCE OF OBJECTS MADE OF DIFFERENT MATERIALS.

Object                  Estimated Distance   Estimated Distance   Estimated Distance
                        Mean (cm)            Variance             Standard Deviation
Action figure           41.99                0.050404             0.224508442
Cardboard box           40.61                0.280707             0.52981796
Glass                   38.74                0.194343             0.440844002
Mirror                  40.58                0.246061             0.496044964
Plastic sphere          40.67                0.263737             0.513553672
Plastic storage box     39.96                0.139798             0.37389568
Sound absorbing panel   1892.54              3231569              1797.656642

The results in Table III confirm the expectations and the limits of the HC-SR05 in the presence of a sound-absorbing panel, since the considered metrics take on, as expected, very unusual values. As for the other objects, even though each of them is characterized by a different shape and material, the mean values are close to the real distance and both the variance and the standard deviation are low enough to assess the reliability of the ultrasonic sensor and, hence, of the URAS. Then, we assessed the robustness of the URAS in the presence of objects placed at different angles, using the cardboard box located at 40 cm as an example.


Fig. 10. The 2D view (i.e. the radar map) displaying the distance from the detected object (a) using the device sensors and (b) using the software control; the 3D view composed of rectangular shapes that assess the presence of many objects in the considered environment (c) using the device sensors and (d) using the software control; (e) the end-user wearing the Google Cardboard; and (f) the VR view composed of rectangular shapes that underline the presence of several objects.

TABLE IV
MEASUREMENT ACCURACY TESTS OF THE URAS IN PRESENCE OF A CARDBOARD BOX PLACED AT DIFFERENT ANGLES.

Angle   Estimated Distance   Estimated Distance   Estimated Distance
        Mean (cm)            Variance             Standard Deviation
30°     42.55                7.4246               2.7248
45°     41.36                0.5761               0.759
60°     43.65                0.4520               0.6723
90°     40.36                0.2529               0.50292

TABLE V
MEASUREMENT ACCURACY TESTS OF THE URAS IN PRESENCE OF A CARDBOARD BOX LOCATED AT SEVERAL DISTANCES.

Real Distance (cm)   Estimated Distance   Estimated Distance   Estimated Distance
                     Mean (cm)            Variance             Standard Deviation
40                   39.64                0.2304               0.482418
80                   78.98                0.7596               0.875941
120                  120.92               1.3136               1.151898
160                  161.4141             1.7219               1.318823
200                  200.3333             1.5011               1.231366
240                  208.1414             0.9075               0.957427
280                  206.7475             1.2675               1.131505
320                  206.46               0.7084               0.845905
360                  206.62               1.0956               1.051982
400                  206.79               1.2459               1.121822

The results presented in Table IV highlight that the ultrasonic sensor is not able to properly estimate the distance from the cardboard box when the box is rotated with respect to the sensor. In particular, when the cardboard box is rotated by 30 degrees, the sensor makes several mistakes in computing the distance, as underlined not only by the mean distance but also by the values of variance and standard deviation. A similar behavior can be noticed when the cardboard box is rotated by 60 degrees, where the estimated mean value is 43.36 cm. Finally, we tested the capability of the URAS by increasing the distance between the system and the small cardboard box, keeping the box facing the sensor. The results in Table V confirm, again, the expectations and the ability of the URAS to correctly compute the distance from the considered objects within a range of 200 cm. If this value is exceeded, the system is obviously no longer able to properly estimate the distance, providing a value around 206 cm. This happens because, as the distance between the sensor and the object increases, the size of the cone field of view increases too, and the ultrasonic sensor is only able to detect objects of considerable size, like, for instance, a wall at 400 cm.

VI. CONCLUSION

Extensive work has been done on the creation of novel solutions based on ultrasonic technologies for the exploration and mapping of new environments. However, most of these works require additional and expensive components, such as an RGB camera or a laptop, and the core of the system is typically embedded in a mobile robot module or in objects such as gloves or hats, making it impossible to directly reuse it in other solutions. In addition, none of these solutions provides a compact, cheap, and versatile system that can be worn by an end-user or integrated in a robot module. Finally, some of the considered systems provide a visualization of the acquired data based on the images acquired by the camera embedded in the system or through 3D rendering, requiring additional computational resources provided, for instance, by a laptop. Conversely, we overcome all these drawbacks through the definition of the URAS, a novel system for a light-free mapping of indoor environments.

In this work, we proposed a novel low-cost, compact ultrasonic radar system for a light-free mapping of indoor environments. In particular, the proposed hardware and software design of the system allowed us to create a small hardware device that can be worn by a user or be integrated in a ready-to-use robot module and to manage the I/O communication via an Android-based device. Our solution is composed of four main elements: (i) the independent module, which is the core of the system and is designed in order to be cheap, compact and light. As a matter of fact, the size of this module is 6x9 cm, its weight is 100 grams; (ii) a prototype of a robot module, used to test the behavior of the system when it is controlled remotely; (iii) an Android-based application, which allows users to display and explore the mapped environment through different types of visualizations (i.e. 2D, 3D and VR) and to control the system remotely; (iv) the Google Cardboard, which allows users to use the VR view. In particular, through the VR view and the Google Cardboard, the end-user can explore the virtual environment, obtaining accurate information about the location and distance of each detected object.


The URAS was designed and developed with a modular structure that allows the behavior of the system to be extended by adding new sensors to the hardware and new features to the mobile application. In addition, the whole system is based on cheap hardware components, enabling the creation of a versatile IoT device that can be easily replicated for less than 53€. Through different sets of measurement-accuracy tests, we have shown that the URAS ensures a correct mapping of environments by using the simple but efficient approach described in Section IV.A. Finally, the proposed solution allows the user to explore, in real time, the environment mapped through the independent module, easily obtaining useful information.
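The per-distance figures reported in the accuracy tests (mean, variance and standard deviation of repeated readings, as in Table V) can be reproduced with a short routine such as the following C++ sketch. The sample readings and the number of repetitions per distance are assumptions introduced only for illustration, since they are not restated in this section.

// Minimal sketch (not the paper's test harness): compute mean, variance and
// standard deviation of a batch of repeated range readings, i.e. the
// quantities reported per distance in the accuracy tables.
#include <cmath>
#include <cstdio>
#include <vector>

struct Stats { double mean; double variance; double stdDev; };

Stats summarize(const std::vector<double>& readingsCm) {
    if (readingsCm.empty()) return { 0.0, 0.0, 0.0 };

    double sum = 0.0;
    for (double r : readingsCm) sum += r;
    const double mean = sum / readingsCm.size();

    double sqDiff = 0.0;
    for (double r : readingsCm) sqDiff += (r - mean) * (r - mean);
    const double variance = sqDiff / readingsCm.size();  // population variance

    return { mean, variance, std::sqrt(variance) };
}

int main() {
    // Hypothetical batch of readings with the target placed at 40 cm
    // (illustrative values only, not the measured data of Table V).
    const std::vector<double> readings = { 40.2, 39.7, 40.1, 39.5, 40.0 };
    const Stats s = summarize(readings);
    std::printf("mean=%.4f cm  variance=%.4f  std=%.4f\n",
                s.mean, s.variance, s.stdDev);
    return 0;
}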


Antonio Tedeschi was born in Atripalda (AV), Italy, in December 1988. He received the Master's degree cum laude in Computer Engineering and the Ph.D. degree in Applied Electronics from the University of Roma Tre, Rome, Italy, in May 2013 and May 2017, respectively. Since January 2014, he has been a member of the Signal Processing for Telecommunications and Economics (SP4TE) laboratory of the University of Roma Tre, focusing his research activity on different fields, such as software and cognitive radio and mobile application development. From March 2016 until June 2016, he was a Visiting Ph.D. Student at the Tampere University of Technology, Department of Electronics and Communications Engineering. He has held different roles in international conferences and journals, such as Program Committee member of the IEEE International Conference FiCloud 2015 and Co-Guest Editor of the special issue "New Applications and Advanced Methodologies for Road Safety and Simulation" of the IJITN journal. His passion for IT has led him to learn mobile programming and to develop native mobile applications for Android and Xamarin as an independent developer and freelancer for companies. Recently, he started a collaboration as an author with HTML.it, one of the most authoritative Italian websites on IT, writing guides and articles about mobile programming.


Since February 2017, he has been working in the R&D section of Immobiliare.it, focusing his activity on machine learning approaches for image detection and recognition and on anomaly detection.

Stefano Calcaterra was born in Rome, Italy, in 1988. He received the Master's degree in Computer Engineering from the University of Roma Tre, Rome, Italy, in December 2014. As a developer, he has focused on the development of mobile applications for Android and iOS, including features such as virtual reality and interaction with the Arduino platform. In 2012, he worked with the Fondazione Ugo Bordoni (FUB, Rome, Italy), developing an Android-based application for the enhancement of cultural heritage. From 2015 to 2016, he worked for HP CDS as a business applications engineer. Since January 2017, he has been working at ES Field Delivery Italia.

Francesco Benedetto (S'04-A'08-M'10-SM'15) was born in Rome, Italy, in 1977. He received the Dr. Eng. degree in electronic engineering and the Ph.D. degree in telecommunication engineering from the University of Roma Tre, Rome, Italy, in May 2002 and April 2007, respectively. In 2007, he was a research fellow of the Department of Applied Electronics of the Third University of Rome. Since 2008, he has been an Assistant Professor of Telecommunications at the Third University of Rome. In May 2017, he won the National Academic Qualification as Associate Professor. His research interests are in the fields of ground penetrating radar, software and cognitive radio, digital signal and image processing for telecommunications and economics, code acquisition and synchronization for 3G mobile communication systems, and multimedia communication. Since 2016, he has been the Chair of the IEEE 1900.1 working group on "Definitions and Concepts for Dynamic Spectrum Access: Terminology Relating to Emerging Wireless Networks, System Functionality, and Spectrum Management". He is the Leader of WP 3.5 on "Development of advanced GPR Data Processing technique" of the European COST Action TU1208 - Civil Engineering Applications of Ground Penetrating Radar. He is an Editor of the IEEE SDN Newsletter, the General Chair of the series of International Workshops on Signal Processing for Secure Communications (SP4SC-2014, 2015, 2016), and the Lead Guest Editor of the Special Issue on "Advanced Ground Penetrating Radar Signal Processing Techniques" of the Signal Processing journal (Elsevier). Dr. Benedetto has also served as a reviewer for several IEEE Transactions, IET (formerly IEE) Proceedings, EURASIP and Elsevier journals, and as a TPC member for several IEEE international conferences and symposia in the same fields.