
Research Article
SmartOntoSensor: Ontology for Semantic Interpretation of Smartphone Sensors Data for Context-Aware Applications

Shaukat Ali, Shah Khusro, Irfan Ullah, Akif Khan, and Inayat Khan

Department of Computer Science, University of Peshawar, Peshawar 25120, Pakistan

Correspondence should be addressed to Shah Khusro; [email protected]

Received 11 April 2016; Revised 10 August 2016; Accepted 11 January 2017; Published 20 February 2017

Academic Editor: Andrea Cusano

Copyright © 2017 Shaukat Ali et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Journal of Sensors, Volume 2017, Article ID 8790198, 26 pages. https://doi.org/10.1155/2017/8790198

The integration of cheap and powerful sensors in smartphones has enabled the emergence of several context-aware applications and frameworks. However, the available smartphone context-aware frameworks are static because they use relational data models with predefined usage of sensory data. Importantly, the frameworks lack the soft integration of new data types and relationships that appear with the emergence of new smartphone sensors. Furthermore, sensors generate huge volumes of data, which intensifies the problem of too much data and not enough knowledge. Making smartphone sensory data smart is essential for advanced analytical processing, integration, inferencing, and interpretation by context-aware applications. In order to achieve this goal, a novel smartphone sensors ontology is required for semantic modeling of smartphones and sensory data, which is the main contribution of this paper. This paper presents SmartOntoSensor, a lightweight mid-level ontology that has been developed using the NeOn methodology and the Content Ontology Design pattern. The ontology describes smartphones and sensors from different aspects including platforms, deployments, measurement capabilities and properties, observations, data fusion, and context modeling. SmartOntoSensor has been developed using Protégé and evaluated using OntoQA, SPARQL, and an experimental study. The ontology is also tested by integrating it into the ModeChanger application, which leverages SmartOntoSensor for automatic changing of smartphone modes according to varying contexts. We have obtained promising results that advocate for the improved ontological design and applications of SmartOntoSensor.

1. Introduction

Smartphones are modern high-end mobile phones combining the features of pocket-sized communication devices (i.e., mobile phones) with PC-like capabilities (i.e., PDAs). Sensory technology was extended to smartphones in order to turn these communication devices into life-centric sensors so that their capabilities and functionalities could be increased substantially. To date, smartphones have a rich set of sensors [1], which have increased their capabilities in several ways, especially by introducing a new class of cooperative services including real-time healthcare, environmental and transportation monitoring, gaming, safety, and social networking [2, 3]. However, in the frame of smartphone context-aware application development, the integration of large-scale sensory data that contains real-time spatial, temporal, and thematic information, which could be used for decision making in a rich tactical environment, is crucially difficult [3].

Today, most of these context-aware applications use a brute-force approach for collecting and analyzing sensory data, which wastes valuable energy and computation resources because it generates observations of minimal use.

The huge sensory data obtained from smartphone sensors intensifies the problem of too much data but not enough knowledge, which is undesirable for smartphone context-aware applications. The problem results from several reasons. First, the heterogeneous sensors produce voluminous data in varying formats and with varying measurement procedures, which makes it difficult for classical Information Retrieval (IR) techniques to help users in searching and retrieving relevant information. Second, sensors differ in their values as well as description terminologies, resulting in terminology mismatch, which makes it difficult for keyword-based searching techniques to retrieve relevant information. Third, sensors' inherent design characteristics and lack of adaptability to varying conditions hamper the accuracy and reliability of the captured data. Fourth, the noisy, asynchronous, and varying-sampling-rate characteristics of heterogeneous sensors can result in missing valuable data, which limits applications to statically predefined usages of the collected data instead of dynamic behaviors. Fifth, missing or improper definitions of domain and sensor specifications and annotation data can hamper the inference of domain knowledge from low-level sensory data. Finally, sensor data fusion could enable the extraction of knowledge that cannot be perceived or inferred using individual sensors. Therefore, in the light of these problems, the available smartphone context-aware applications and frameworks are ill-equipped to handle raw sensory data: the usage of sensory information is static and predefined, with no soft integration of new data types from newly emerging sensors. A slight change in technologies and conditions would compel the redesign of an entire application. Furthermore, to develop practically useful applications, developers require actionable knowledge, which cannot be obtained from raw sensory measurement information [4]. Therefore, as in other real-world sensor-based systems, the processing, management, and interpretation of smartphone sensory data are a big challenge that can be resolved by either smarting the applications or smarting the data. The latter approach is more practical, leveraging state-of-the-art technologies for a more meaningful representation and semantic interpretation of smartphone sensors and sensor observations for use in potential context-aware applications.

The potential of sensor technology cannot be optimally exploited until a well-developed common language for expressing different aspects of sensors is available [5]. Sensors and sensor observations have already been standardized for improving interoperability among heterogeneous sensor data repositories and applications [6]. These standards, however, provide syntactic interoperability with no facilities for semantic descriptions enabling computer logic and reasoning [7]. Therefore, Semantic Web technologies could be used to provide a semantic layer that enhances the semantic compatibility and interoperability of smartphone sensors and data. In this regard, ontologies allow the annotation of sensory data with spatial, temporal, and thematic metadata to enhance semantic understanding and interoperability and to map relationships between mismatching terms, thereby improving the performance of a system [8]. Therefore, a smartphone sensors ontology is immensely needed to provide a common and widely accepted language as well as a dictionary of terminologies for understanding the structure of information regarding smartphones, sensors, and sensory observations; to provide highly expressive representations and advanced access; to reuse smartphone and sensor domain knowledge; to enable formal analysis of sensor resources and data; to map high-level contexts; and to make the domain knowledge explicit without requiring knowledge of technical details regarding format and integration. Furthermore, the ontology would allow for the classification of the capabilities and observations of sensors, provenance of measurements, reasoning about individual sensors, connecting a number of sensors as a microinstrument, semantic interoperability and integration, and other types of assurance and automation. Annotating sensory data would enhance sensor data fusion and interoperability between heterogeneous sensors, improve reuse as compared to syntactic representation, and provide contextual information for situation awareness. The ontology would revolutionize smartphone context-aware applications by providing a broader data model with the potential for integrating new and emerging contents and data types. The ontology would separate the application knowledge from the operational knowledge, which would make application and knowledge management easier and bring semantic interoperability among applications [9].

The objective of this paper is to design and develop an ontology for smartphone sensors, namely, SmartOntoSensor (SOS), which consists of a formal conceptualization of smartphones in general and smartphone sensors in particular for context representation. SmartOntoSensor has numerous characteristics including (1) semantic annotation of smartphones and sensors to increase data interoperability, (2) fusion of multisensor data to support intelligent decision making, (3) resolution of sensor data heterogeneity to express data uniformly, (4) description of the sensing and measurement capabilities of sensors to increase sensor data sharing and reuse, and (5) addition of contexts and contextual information to support context-aware applications. The SmartOntoSensor framework is kept as conservative as possible while keeping the option open for changes, potential reuse, extension, and plugging into other smartphone domain-specific heavyweight ontologies. This is due to the speedy evolution of the smartphone hardware and software industries, which makes it essential that decisions made today regarding smartphones and associated sensor specifications are adaptable and extensible. This paper presents a pragmatic approach to the development of a smartphone sensors ontology with no claim that an orthogonal, complete, or universally acceptable smartphone sensors ontology is feasible. However, SmartOntoSensor is evaluated using state-of-the-art technologies and standards, and the results are promising. The rest of the paper is organized as follows: Section 2 briefly presents related work, Section 3 presents the proposed SmartOntoSensor ontology, Section 4 presents evaluation and discussion, Section 5 presents some of the potential applications of SmartOntoSensor, and finally Section 6 concludes our discussion. References are presented at the end.

2. Related Work

Sensor networks empower the Internet with the acquisition of contextual information by observing and measuring real-world incidents and pave the way for the creation of context-aware platforms, applications, and services. However, to gain a high degree of success and adoptability, the data captured by different types and levels of sensors in sensor networks need to be utilized productively. Despite the extensive research efforts in sensor and sensor network technologies, a universally accepted language for representing sensors' definitions, properties, taxonomies, performance descriptions, and so forth was needed to enhance data fusion and interoperability in a network-centric environment [10]. Therefore, several researchers have investigated ontologies for the semantic representation of sensors and sensor networks and came up with a number of ontologies and ontological models [4, 5, 7, 8, 10–15], which are briefly described here.

Avancha et al. [11] used an ontology for adaptive wireless sensor networks, enabling sensor nodes to adapt to available power, environmental factors, and current operating conditions while maintaining calibration and communication. Its usefulness lies in the concepts representing a sensor node as a system having different components and their relationships. OntoSensor [5, 10] is a general knowledge base of sensors for query and inference and adapted its concepts from SensorML [16], IEEE SUMO [17], and ISO 19115. However, it lacks a distinctive data description model to facilitate interoperable data representation for sensor observation and measurement data. In addition, it provides no constructs to describe sensors as processes similar to SensorML [7]. Eid et al. [13] have described a two-layer high-level framework for a universal sensors ontology in order to describe a hierarchical knowledge model of sensors, dynamic observational properties of transducers, and the integration of domain-specific ontologies with the ontology. An important aspect of the ontology is the notion of virtual sensors formed by physical sensors for more abstract measurements and operations. However, the ontology mainly focuses on data and measurements with little capacity to describe sensors, systems, and how measurements are taken. CSIRO [7] is a generic ontology that organizes concepts into four core clusters while covering a broad range of concepts for describing sensors, groundings, operational models, processes, and measurements. However, the ontology contains no concepts for describing components of platforms. The Coastal Environmental Sensing Networks (CESN) ontology [12] contains descriptions of sensor types, deployment, location, and physical properties. The CESN ontology is, however, very limited, having 10 concept definitions for sensor instances and 6 individuals [18]. The Semantic Sensor Network (SSN) ontology [8] has reused the DUL (DOLCE Ultra Lite) upper ontology for describing sensors in terms of measurement capabilities, observations, sensing methods, deployments, and operating and survival ranges along with performance within those ranges, for enhancing the discovery and querying of sensors in a network-centric environment. The ontology is more general and comprehensive because it provides most of the necessary details about different aspects of sensors and measurements and can pave the way for the construction of any domain-specific sensors ontology. The SWAMO ontology [19] provides interoperability between sensor web products and services. It includes concepts for sensors and actuators and enables autonomous agents for system-wide resource sharing, distributed decision making, and automatic operations. The ISTAR ontology [14] solves the problems faced by intelligence and surveillance, including dynamic selection and assignment of sensors (i.e., depending on requirements, fitness, etc.) for individual tasks in a mission. The service-oriented sensor network ontology [15] is developed using the Geography Markup Language (GML), SensorML, SUMO, and the OntoSensor ontology for describing sensor services. The ontology emphasizes developing a sensor descriptions ontology for sensor discovery and description of sensor metadata in a heterogeneous environment. Korpipää and Mäntyjärvi [4] have designed an ontology for mapping raw sensory data from mobile devices to high-level semantic descriptions of composite contexts. The ontology encourages the quick development of sensor-based mobile applications, more efficient use of development and computing resources, and reuse as well as sharing of information between communicating entities. However, the ontology provides no description of sensors and platforms and their relationships with contexts. OOSTethys [20] is an observation-centered ontology describing an observation as a procedure for estimating the property value of a feature of interest and a process as a system comprising other systems or atomic processes.

Sensors used in different applications including home appliances, robotics, military, and earth sciences can have analogous characteristics, but their needs and usage make them unique [21]. For example, sensors used in weather forecasting measure basic physical properties including air pressure, humidity, temperature, and wind speed [21], whereas sensors used in military missions provide information about hostile terrain such as tactics, location, movement, strength, and equipment and support the development of countertactics and strategies [10]. Smartphone sensors have the same sensing capabilities but with more potential usage than the sensor nodes found in sensor networks. This is because, in addition to sensing capabilities, these sensors are locally supported by rich processing, storage, and communication capabilities that are integrated in a single unit and turn smartphones into smart sensors. However, the available ontologies for sensors and sensor networks [4, 5, 7, 8, 10–15] cannot be applied directly to smartphone sensors-based context-aware computing due to a number of shortcomings. These include (1) variations in scope and completeness, where the ontologies were exclusively developed for sensors and sensor networks by describing heterogeneous sensor nodes and enhancing data fusion and interoperability in a network-centric environment, (2) lack of a unified ontology framework and consistency in definitions of concepts, leading to poor reuse and sharing, and (3) lack of expressiveness due to a shallow explicit hierarchy of concepts and poor logic of relationships, which could result in unsatisfactory reasoning [22]. In addition, technically they tend to be shallow and cover superficial aspects of sensors that are expressed in taxonomies and captured as class hierarchies [10]. Therefore, none of the existing sensor and sensor network ontologies is detailed enough to provide constructs that could be applied directly to meet the unique needs, features, and applications of smartphones and associated sensors in real-world scenarios. However, these ontologies contain important ingredients (i.e., concepts, relationships, and axioms) that can be reused to provide the necessary grounds and understanding for a smartphone sensors ontology.

3. SmartOntoSensor Ontology

The lifestyle of people changes with developments in society, resulting in new events, interactions, and needs. Smartphones, because of their sensing capabilities, have the potential to capture these aspects and understand the needs of users. However, due to the complexity of understanding users' lifestyles, smartphone sensors are needed to perceive complex objects and their actions as well as interactions effectively under varying operating conditions, strict power constraints, and highly dynamic situations. Furthermore, smartphone-based context awareness works by capturing extensive sensory measurements for inferring complex contextual information, including information about environmental conditions, identities of objects in the environment, physical activities of objects and their positions as well as interactions, and the ongoing tasks. Such context-aware smartphone sensor applications demand comprehensive semantic modeling of smartphone sensor data. It is observed that the role of smartphone sensor ontologies is inevitable for improving the power of smartphone context-aware applications. In order to meet these unique needs and applications, a comprehensive smartphone sensors ontology is demanded, which consists of a domain theory, represented in a language, and constructed using a functional and relational basis to support ontology-driven inference.

The intended purpose of SmartOntoSensor is to develop an ontological model consisting of a formal conceptualization of smartphone resources and sensors including their categories, taxonomy, relationships, and metadata regarding sensor characteristics, performance, and reliability. In addition, SmartOntoSensor includes logical statements that describe associations among component and sensor concepts as well as aspects of their operating principles, computing capabilities, platforms, observations and measurements, and other pertinent semantic contents. The primary objectives of SmartOntoSensor include the following.

(i) Providing a semantic framework for capturing the important functionality features of smartphone sensors, enabling context-aware applications to reason about the available and running sensors so as to apply them to current information needs, querying, and retasking as needed.

(ii) Providing context-aware applications with a semantic interface for managing, processing, integrating, and making sense of data acquired from a set of heterogeneous smartphone sensors.

(iii) Providing semantic descriptions of smartphone sensors for reasoning about the available sensors' capabilities and performance in order to construct low-cost combinations of sensors for achieving the goals of an operation.

(iv) Providing the basis for new measurement methods to evaluate each perception system's ability (sensors and algorithms) to perform the required tasks.

The development of SmartOntoSensor is an attempt to provide a common understanding of the data captured by smartphone sensors to increase information value and reusability for application development and sharing. The goals and design principles suggested by [4] are followed in the development of the ontology to ensure its coverage, validity, and usability:

(i) The ontology has been developed for representing information in the domain of smartphones and sensors and for mapping sensory information into high-level contexts to be used in a variety of context-aware applications.

(ii) The ontology describes concepts, relations, and expressions in a simple and easy-to-understand manner so that it can be easily and effectively used by application developers.

(iii) The ontology is flexible and extendable, as it allows developers/users to add, with minimal overhead, new domain-specific concepts and complementary relations in order to enhance interoperability and knowledge sharing among smartphone context-aware applications.

(iv) The ontological representation facilitates inference by allowing developers to employ an efficient inference method using recognition engines or application control.

(v) The ontology is general, describing concepts, facets, and relationships that are applicable to a wide range of smartphone platforms, embedded sensors, data formats, measurement capabilities and operations, and contexts.

(vi) The ontology is memory-efficient and supports time-efficient inference methods. The imports and constructs in the ontology are defined while keeping in mind the limited memory and processing resources of smartphones; that is, a smartphone should not be jeopardized during an inferencing task.

(vii) The ontology provides detailed information about its ingredients, and the versatility of expressions is high. The ontology is lightweight but comprehensive, declaring enough concepts and relationships to describe possibly every aspect of the domain of interest.

(viii) The concepts and properties in the ontology are defined and arranged precisely for ease of access and for producing potentially high values for quality and completeness metrics. In addition, the ontology can be easily used in any of the smartphone Semantic Web frameworks such as AndroJena (a minimal loading-and-querying sketch is given after this list).
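As a rough illustration of principle (viii), the sketch below loads an OWL serialization of a SmartOntoSensor-style ontology and runs a simple SPARQL query over it. Python's rdflib is used purely for demonstration (an on-device application would instead use a framework such as AndroJena); the file name smartontosensor.owl is an assumption, not a published artifact name.

```python
from rdflib import Graph

g = Graph()
# Assumed local copy of the ontology in an RDF/XML (OWL-DL) serialization.
g.parse("smartontosensor.owl", format="xml")

# List every class declared in the loaded ontology.
query = """
    PREFIX owl: <http://www.w3.org/2002/07/owl#>
    SELECT ?cls WHERE { ?cls a owl:Class }
"""
for row in g.query(query):
    print(row.cls)
```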

3.1. Materials and Methodology. In developing SmartOntoSensor, the NeOn methodology [23] is used, which emphasizes the searching, reusing, reengineering, and merging of ontological and nonontological resources and the reuse of ontology design patterns, which are the main design rationales of SmartOntoSensor. However, the NeOn methodology [23] lacks ontology project management features, which are adopted from the POEM methodology [24]. Good ontological engineering emphasizes leveraging upper and related ontologies consisting of general ingredients and providing a common foundation for defining concepts and relationships in a specialized domain-specific ontology.


Figure 1: 3SOC ontology design pattern for SmartOntoSensor (Smartphone, Sensor, Stimulus, Observation value, and Context).

Table 1: An excerpt of the lexicon.

Smartphone, WiFi, Type, IMEI, Accuracy, Resolution, Humidity, Latitude, Location, Physical sensor, Service, Country, Manufacturer, Accelerometer, Sensing, Roll, Bluetooth, Observation value, Logical sensor, Manufacturer email, Temperature, Hardware, Profile, Multitouch, Version, Passive sensing, Time, Unit, Input, Output, Single output, Event status, Pitch, Active sensing, Observation, Indoor range.

The use of standard ontologies implies shorter development cycles, universalism, identification of the initial requirements set, easier and faster integration with other contents, and more stable and robust knowledge systems [13]. The Content Ontology Design pattern [25] is used, where contents in SOS are conceptual, instantiated from logical upper-level ontologies, and provide an explicit nonlogical vocabulary for the domain of interest. Furthermore, the Componency pattern is used for representing classes/objects as either proper parts of other classes/objects or as having proper parts. Technically, the Stimulus-Sensor-Observation pattern [8] is extended into the Smartphone-Sensor-Stimulus-ObservationValue-Context (3SOC) design pattern (shown in Figure 1) in order to represent the relationships and flow of information between the components from inception to application. Verbally, a smartphone system contains sensors for detecting stimuli containing observations for producing an observation value, which could be used to identify a context for invoking a service to take an appropriate action.
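The 3SOC chain can be written down directly as triples. The fragment below is a minimal sketch under assumed prefixes: the sos: namespace URI and the individual names are illustrative and not taken from the published ontology, while the class and property names follow the ones used in this section.

```python
from rdflib import Graph, Namespace
from rdflib.namespace import RDF

SOS = Namespace("http://example.org/SmartOntoSensor#")  # assumed IRI
SSN = Namespace("http://purl.oclc.org/NET/ssnx/ssn#")   # SSN namespace (indicative)

g = Graph()
g.bind("sos", SOS)
g.bind("ssn", SSN)

# Smartphone -> Sensor -> Stimulus -> Observation value -> Context -> Service
g.add((SOS.phone1, RDF.type, SOS.Smartphone))
g.add((SOS.phone1, SOS.hasSensor, SOS.accelerometer1))        # system contains sensor
g.add((SOS.accelerometer1, SSN.detects, SOS.shakeStimulus))   # sensor detects stimulus
g.add((SOS.accelerometer1, SSN.observes, SOS.Acceleration))   # observed property
g.add((SOS.output1, SSN.isProducedBy, SOS.accelerometer1))    # sensor output
g.add((SOS.output1, SSN.hasValue, SOS.value1))                # observation value
g.add((SOS.value1, RDF.type, SSN.ObservationValue))
g.add((SOS.value1, SOS.identifyContext, SOS.runningContext))  # value identifies context
g.add((SOS.runningContext, SOS.startService, SOS.silentModeService))  # invoke service

print(g.serialize(format="turtle"))
```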

3.1.1. SmartOntoSensor Requirements Specifications. The ontology requirements specification activity is performed to identify and collect the requirements that the ontology should fulfill. An Ontology Requirements Specification Document (ORSD) is formed explaining (1) the purpose and reasons for building the ontology; (2) the scope of the ontology, namely, to fuel applications that map smartphone sensory data into high-level contexts so that services can be adapted according to those contexts; (3) the users and beneficiaries of the ontology, who will be developing applications/services that interact with smartphones and services; (4) the ontology as a knowledge base to store data about smartphones, resources/devices, sensors, measurement capabilities and properties, contexts, services, profiles, and so forth; and (5) the degree of formality of the ontology, which is implemented in OWL-DL to get maximum expressiveness with computational completeness.

The SmartOntoSensor requirements are mainly concerned with nonfunctional and functional requirements. The nonfunctional requirements comprise terminological requirements (i.e., the collection of terms used in the ontology from the standards that could be used to express competency questions) and the naming convention used for the terms. The terminological requirements of SmartOntoSensor can be broadly divided into several categories: (1) base terms represent the basic classes of entities in the domain of interest, which could be further extended into subclasses; (2) system terms represent components, subcomponents, resources, deployments, metadata, and so forth of a system; (3) sensor terms represent types, characteristics, processes, operations, configuration, metadata, and so forth of sensors; (4) observation terms represent the input, output, response model, observation condition, and so forth of observations that are used and produced by sensors; (5) domain terms are used for units of measurement, feature selections and calculations, sampling pattern definitions, and so forth; (6) context terms are used for recognizing a context such as location, time, event, activity, and user; and (7) storage terms are used for the storage units, such as file and folder, used for storing sensor-captured observations and other data. A lexicon representing the set of terminologies in the problem domain and applications is identified and collected from application-specific and domain-specific documents. An excerpt of the lexicon is reported in Table 1.

The SmartOntoSensor functional requirements represent the intended tasks and are expressed as competency questions, which the ontology should answer by executing SPARQL queries such as "what is a smartphone's location?," "what is the sensing accuracy of a smartphone's X sensor?," "which of a smartphone's sensors could be used for recognizing a context Y?," "what is the accelerometer sensor x-axis observation value for the sitting context?," "what are the environmental conditions for a sensor to work?," and "what is the humidity level of an environment?" The set of possible competency questions helps in determining the correctness, completeness, consistency, verifiability, and understandability of the requirements.
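To make the mapping from competency questions to SPARQL concrete, the sketch below encodes a query in the spirit of CQ5 in Table 2 ("Which sensors can recognize a running context?"). The prefix URI and the property name sos:canRecognizeContext are assumptions used for illustration; the actual property IRIs are those defined in the ontology itself.

```python
from rdflib import Graph

g = Graph()
g.parse("smartontosensor.owl", format="xml")  # assumed local serialization

# CQ5: which sensors can recognize a running context?
cq5 = """
    PREFIX sos: <http://example.org/SmartOntoSensor#>
    SELECT ?sensor WHERE {
        ?sensor sos:canRecognizeContext sos:Running .
    }
"""
for row in g.query(cq5):
    print(row.sensor)
```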


Table 2: An excerpt of the competency questions.

CQ1: What is the status of a smartphone?
CQ2: Is an accelerometer sensor available in a smartphone?
CQ3: What is the location of a smartphone?
CQ4: What is the output of the accelerometer sensor?
CQ5: Which sensors can recognize a running context?
CQ6: What are the device conditions for the GPS sensor to work?
CQ7: Who is the user of a smartphone?
CQ8: Which service would be invoked if a sleeping context is detected?
CQ9: What are the activities composing a sensing process?
CQ10: Which of the sensors are used by the accelerometer as supporting sensors for detecting a running context?

Table 3: Relationship between scenarios and the three development iterations (1st, 2nd, and 3rd).

Scenario 2, Reusing and reengineering nonontological resources: one iteration.
Scenario 3, Reusing ontological resources: one iteration.
Scenario 4, Reusing and reengineering ontological resources: one iteration.
Scenario 5, Reusing and merging ontological resources: two iterations.
Scenario 6, Reusing, merging, and reengineering ontological resources: two iterations.
Scenario 7, Reusing ontology design patterns (ODPs): all three iterations.
Scenario 8, Restructuring ontological resources: one iteration.
Scenario 9, Localizing ontological resources: one iteration.

The domain characteristics, which are difficult to express as competency questions, are written as natural language sentences such as "fusion of data from multiple sensors for mapping a context" and "extreme environmental conditions can affect sensor observations and performance." The initial iteration produces a short list of functional requirements. However, the list is improved significantly in the subsequent iterations and consists of 156 competency questions and 40 domain characteristics. An excerpt of the competency questions is shown in Table 2.

3.1.2. SmartOntoSensor Development Resources and Tools. Following the NeOn methodology [23], SmartOntoSensor is developed by reusing existing knowledge resources. The development task is divided into three iterations, where both ontological and nonontological resources are reused in the first and second iterations, and the ontology design pattern is included in all of the development iterations. Table 3 shows the selected scenarios to be carried out in combination with Scenario 1.

SmartOntoSensor is constructed by reusing multiple relevant ontologies. The review and analysis of sensor and sensor network ontologies and sensor vocabularies have highlighted that reusing available sources describing sensors, their capabilities, the systems they are part of, observations and measurements, properties and associations, quantitative values for properties, and so forth can provide a promising start for building SmartOntoSensor. After thoroughly analyzing the sensor and sensor network ontologies, the SSN ontology [8] is found most relevant because it provides an advanced schema for describing sensor equipment, observation measurement, and sensor processing properties. SSN has a wide range of generality and extension space and has been reused in several projects to solve complex problems [26]. Therefore, SSN is extended for the development of SmartOntoSensor. Other ontologies were also found to contain relevant ingredients, but excessive imports can cause certain problems including decreases in efficiency, simplicity, consistency, verifiability, and flexibility [27]. Some categories, taxonomies, and definitions of commonly used concepts, properties, and metadata are adopted in part from SensorML [16]. Although the initial objective was to faithfully replicate the required items from SensorML, some implementation compromises and workarounds are made exclusively to meet the unique demands of the new paradigm. The context ontology (CXT) developed by the CoDAMoS project [28] is imported and extended with the required domain concepts and relationships for modeling context in SmartOntoSensor. Sensor data are streams requiring indefinite timestamp sequence information for unique representation. Therefore, the OWL Time (TIME) ontology is reused for incorporating time information into SmartOntoSensor. To develop SmartOntoSensor, classes in the imported ontologies are either used directly or extended by making SmartOntoSensor classes subclasses of the relevant classes. Furthermore, classes in the imported ontologies representing the same concepts are aligned by declaring them equivalent classes, and other classes are refactored either by restructuring the class hierarchy or by defining new associations and relationships that enable them to be used in SmartOntoSensor as per needs.
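The reuse strategy just described boils down to a handful of OWL axioms: imports, subclass axioms against SSN, and equivalence axioms for aligned classes. The sketch below illustrates these kinds of axioms with rdflib; the sos: and CoDAMoS namespace URIs are assumptions, the SSN and OWL Time IRIs are indicative, and the class names are the ones used in this paper.

```python
from rdflib import Graph, Namespace, URIRef
from rdflib.namespace import RDF, RDFS, OWL

SOS = Namespace("http://example.org/SmartOntoSensor#")   # assumed IRI
SSN = Namespace("http://purl.oclc.org/NET/ssnx/ssn#")    # indicative SSN namespace
CXT = Namespace("http://example.org/CoDAMoS/context#")   # assumed IRI

g = Graph()
onto = URIRef("http://example.org/SmartOntoSensor")      # ontology header (assumed)
g.add((onto, RDF.type, OWL.Ontology))

# Reuse by import: SSN and OWL Time (the CoDAMoS context ontology is imported similarly).
g.add((onto, OWL.imports, URIRef("http://purl.oclc.org/NET/ssnx/ssn")))
g.add((onto, OWL.imports, URIRef("http://www.w3.org/2006/time")))

# Reuse by extension: SmartOntoSensor classes specialize imported ones.
g.add((SOS.Smartphone, RDFS.subClassOf, SSN.System))
g.add((SOS.PhysicalSensor, RDFS.subClassOf, SSN.Sensor))

# Reuse by alignment: classes with the same semantics are declared equivalent.
g.add((SSN.Platform, OWL.equivalentClass, CXT.Platform))
```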


Figure 2: Abstract-level structure of SmartOntoSensor, highlighting the contributing information sources: it extends the SSN ontology, the Time ontology, and the Context ontology, replicates parts of SensorML, and includes content drawn from the literature.

The additional domain-specific contents of SmartOntoSensor are captured from a detailed investigation and analysis of the related literature. Figure 2 shows the abstract-level structure of SmartOntoSensor by highlighting the contributing information sources. To communicate the semantics of sensor observations, an appropriate terminology (obtained in the lexicon) is defined for describing concepts, relations, and processes. The terminology used is discussed in the subsequent sections. After formally defining the constituents, SmartOntoSensor is developed in the OWL-DL language using the open source ontology editor and knowledge-based framework Protégé 4.3 along with its plug-ins (e.g., the SPARQL query plug-in and the RacerPro reasoner) for ontology editing, development, implementation, and testing.

3.2. SmartOntoSensor Framework. The SmartOntoSensor framework is conceptually (but not physically) organized into 9 modular subontologies, where each modular subontology represents a subdomain and a central ontology links the other ontologies. Each of the subontologies contains a number of concepts and properties for modeling a specific aspect of the domain of interest. The SSN framework [8] has provided inspiration for the development of the SmartOntoSensor framework, and some of its modules are refactored and merged for inclusion into the SmartOntoSensor framework. The novelty of the framework is its fourfold information presentation, which fulfills the objectives of SmartOntoSensor. First, detailed information about smartphone systems regarding their hardware components, software, platforms, metadata, potential deployments, and so forth is presented. Second, detailed information about smartphone sensors regarding their inputs, outputs, processing, observations, measurements, operating restrictions, capabilities, and so forth is presented. Third, the potential applications of observation values (i.e., produced by smartphone sensors after a sensing process) for context recognition and context modeling as a whole, such as user information and profiles, current and planned activities and events, and the device and its surroundings, are presented. Fourth, the enhancement of smartphone context-awareness capability for solving the challenges of adapting applications according to users' contexts is presented. Figure 3 depicts the SmartOntoSensor framework and a snippet of the main concepts and their relationships. The framework represents the main high-level concepts and object properties in a subdomain, whereas detailed concepts and object properties are left aside. In the framework, classes are represented with rectangles, and subclass axioms and object properties are represented with solid and dotted arrow lines, respectively. The detailed discussion of the SmartOntoSensor framework's subdomains is presented in Sections 3.2.1 to 3.2.8.

3.2.1. Smartphone. The smartphone subdomain is constructed around concepts for modeling knowledge about a smartphone system, describing its resources (i.e., hardware and software), organization, deployment, and platform aspects. The SOS:Smartphone concept is derived from the SSN:System concept of the SSN ontology [8], providing the necessary properties for deployment and platform. SOS:Smartphone could have different hardware resources including sensing (SOS:SensingResource), memory (SOS:MemoryResource), and network (SOS:NetworkResource) resources and software resources including operating system (CXT:OperatingSystem) and middleware (CXT:Middleware). Each of these resources is a system by itself. A number of object properties (i.e., SOS:hasBluetooth, SOS:hasWiFi, SOS:hasSensor, hasOperatingSystem, etc.) are made subproperties of SSN:hasSubSystem for linking SOS:Smartphone with resources. SOS:Smartphone inherits the SSN:hasOperatingRange and SSN:hasSurvivalRange properties to define the extremes of operating environments and other conditions in which a smartphone is intended to survive and operate while providing functionalities including standard configuration, battery lifetime, and system lifetime. A smartphone could be mounted on or connected (SSN:onPlatform) with a platform (SSN:Platform ≡ CXT:Platform), which could provide hardware (CXT:providesHardware) and software (CXT:providesSoftware) features. SOS:Smartphone could be deployed (SSN:hasDeployment → SSN:Deployment) at a specific place, including worn on a helmet (SOS:Halmet ⊆ (SSN:Platform ≡ CXT:Platform)) or around the neck, attached to a belt (SOS:Belt ⊆ (SSN:Platform ≡ CXT:Platform)), placed on a selfie stick (SOS:SelfiStick ⊆ (SSN:Platform ≡ CXT:Platform)), or placed in a trouser pocket.


Figure 3: SmartOntoSensor framework, showing the main high-level concepts and object properties of the modular subontologies (Smartphone, Resources, Sensor, Process, Observation, Operational Capabilities, Metadata, Time, and Context and Service modules) and the links between them.

SSN:Deployment could be a complete process of installation, maintenance, and decommissioning and could have spatial (SOS:hasDeploymentPosition) and temporal (SOS:hasDeploymentTime) properties.
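A hedged sketch of how an individual smartphone could be described along the lines of this subdomain is given below; the prefix URIs, individual names, and literal values are assumptions for illustration, while the class and property names are those introduced above.

```python
from rdflib import Graph, Namespace, Literal
from rdflib.namespace import RDF, XSD

SOS = Namespace("http://example.org/SmartOntoSensor#")  # assumed IRI
SSN = Namespace("http://purl.oclc.org/NET/ssnx/ssn#")   # indicative SSN namespace

g = Graph()

# A smartphone as a system with subsystems (sensors, network resource).
g.add((SOS.phone1, RDF.type, SOS.Smartphone))
g.add((SOS.phone1, SOS.hasSensor, SOS.accelerometer1))
g.add((SOS.phone1, SOS.hasWiFi, SOS.wifi1))

# Mounted on a platform and deployed at a position and time.
g.add((SOS.phone1, SSN.onPlatform, SOS.belt1))
g.add((SOS.belt1, RDF.type, SOS.Belt))
g.add((SOS.phone1, SSN.hasDeployment, SOS.deployment1))
g.add((SOS.deployment1, SOS.hasDeploymentPosition, Literal("waist")))
g.add((SOS.deployment1, SOS.hasDeploymentTime,
       Literal("2017-02-20T09:00:00", datatype=XSD.dateTime)))
```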

3.2.2. Sensor. The sensor subdomain is the cornerstone of SmartOntoSensor and serves as a bridge connecting all other modules. It models smartphone sensors using concepts and properties for describing the taxonomy of sensors, types of operations, operating conditions, resource configurations, measured phenomena, and so forth. During the alignment process between SSN and SmartOntoSensor, the SSN:Sensor concept is extended by a detailed hierarchy of sensors in SmartOntoSensor. SSN:Sensor is categorized into logical (SOS:LogicalSensor) and physical (SOS:PhysicalSensor) sensors, where physical sensors represent hardware-based sensors and logical sensors represent software-based sensors that are created by employing one or more physical sensors. Individual sensors (e.g., SOS:Accelerometer) are declared subclasses of either physical or logical sensors. A physical sensor has a type of operation (SOS:hasTypeOfOperation), which could be either active sensing (SOS:ActiveSensing) or passive sensing (SOS:PassiveSensing). A physical sensor has to work under certain conditions and has specific features such as SSN:Accuracy, SSN:Latency, SOS:Hystheresis, and SSN:Sensitivity to capture a particular stimulus. A sensor could have its own hardware (SOS:hasSensorHardware) and software (SOS:hasSensorSoftware) specifications. A sensor can measure (SOS:measure) properties of a phenomenon that can be quantified (i.e., that can be perceived, measured, and calculated), which could be either a physical quality (SOS:PhysicalQuality) or a logical quality (SOS:LogicalQuality). A sensor detects (SSN:detects) a stimulus (SSN:Stimulus), which is an event in the real world that triggers a sensor, could be the same as or different from the observed property, and serves as a sensor input (SSN:SensorInput). The output (SSN:"Sensor Output") produced by a sensor can be either a single value (SOS:SingleValue) or more than one value (SOS:TupleValue).
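The taxonomy and the type-of-operation link can be stated as a few class and property axioms. The sketch below is illustrative only (assumed prefix URIs, and the disjointness axiom is added here purely for illustration); the class names mirror those listed in this subsection.

```python
from rdflib import Graph, Namespace
from rdflib.namespace import RDF, RDFS, OWL

SOS = Namespace("http://example.org/SmartOntoSensor#")  # assumed IRI
SSN = Namespace("http://purl.oclc.org/NET/ssnx/ssn#")   # indicative SSN namespace

g = Graph()

# Physical vs. logical sensors, both specializing ssn:Sensor.
g.add((SOS.PhysicalSensor, RDFS.subClassOf, SSN.Sensor))
g.add((SOS.LogicalSensor, RDFS.subClassOf, SSN.Sensor))
g.add((SOS.PhysicalSensor, OWL.disjointWith, SOS.LogicalSensor))  # illustrative axiom
g.add((SOS.Accelerometer, RDFS.subClassOf, SOS.PhysicalSensor))

# A concrete accelerometer and its type of operation (passive sensing).
g.add((SOS.accelerometer1, RDF.type, SOS.Accelerometer))
g.add((SOS.accelerometer1, SOS.hasTypeOfOperation, SOS.PassiveSensing))
```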

3.2.3. Process. In addition to physical instrumentation, a sensor has associated functions and processing chains to produce valuable measurements. The sensor concept has to implement (SSN:implements) a sensing process (SSN:Sensing), which could be either participatory (SOS:Participatory) or opportunistic (SOS:Opportunistic). A process can have a subprocess (SOS:subProcess) such that a sensing process could require a prior calibration process. Therefore, concepts for calibration (SOS:Calibartion) and maintenance (SOS:Maintenance) processes are declared as subclasses of SSN:Process and are disjoint with SSN:Sensing. A sensing process could have input (SOS:Input) and output (SOS:Output) parameters. A sensing process can use calibration and maintenance processes as complementary and supporting classes using the SOS:SubProcess property. A process has a type (SOS:hasProcessType), either physical (SOS:PhysicalProcess) or logical (SOS:LogicalProcess), where a physical process essentially has a physical location or interface and a logical process does not. A sensing process has a process composition describing the algorithmic details (i.e., sequence, condition, and repetition) of how outputs are made out of inputs. Therefore, a process has a control structure (SOS:hasControlStructure) for linking with instances of SOS:AtomicProcess, SOS:CompositeProcess, and SOS:PhysicalProcess. A sensing process could implement (SOS:implement) machine learning techniques (SOS:MLT) for extracting features of interest from an input to create an output.

3.2.4. Measurement. The measurement subdomain is constructed to complement the sensor perspective. The main concepts in this subdomain are SSN:Observation, SSN:FeatureOfInterest, and SSN:ObservationValue. A sensor has an observation (SSN:Observation) representing a situation in which a sensing method (SSN:Sensing) estimates the value of a property (SSN:Property) of a feature of interest (SSN:FeatureOfInterest), where a feature is an abstraction of a real-world phenomenon. An observation is formed from a stimulus (SSN:Stimulus) in a contextual event (SOS:Event), which serves as input to a sensor. A sensor input is a proxy for a property of the feature of interest. An observation (SSN:Observation) should represent the observed property (SSN:observedProperty), the sensing method used for the observation (SSN:sensingMethodUsed), the quality of the observation (SSN:qualityOfObservation), the observation result (SSN:observationResult), and the time (CXT:Time) at which the sampling took place (SSN:observationSamplingTime). An observation result is a sensor output (SSN:SensorOutput), which is an information object (SSN:SensorOutput ⊆ SSN:InformationObject). The information object represents a measurement construct for interpreting events, participants, and the associated result and signifies the interpretative nature of observing by separating a stimulus event from its potentially multiple interpretations. SSN:SensorOutput has a value (SSN:hasValue) for the observation value (SSN:ObservationValue). SSN:ObservationValue represents the encoded value of a feature and depends indirectly on the accuracy, latency, frequency, and resolution of the sensor producing the output. The observation value is annotated with location (SOS:hasObservationLocation), theme (SOS:hasObservationTheme), and time (SOS:hasObservationTime) information for identifying a context (SOS:identifyContext).
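As a concrete illustration of this subdomain, the sketch below records one accelerometer observation whose output value is annotated with time and location and linked to a context; the prefix URIs, individual names, and literal values are assumptions for illustration.

```python
from rdflib import Graph, Namespace, Literal
from rdflib.namespace import RDF, XSD

SOS = Namespace("http://example.org/SmartOntoSensor#")  # assumed IRI
SSN = Namespace("http://purl.oclc.org/NET/ssnx/ssn#")   # indicative SSN namespace

g = Graph()

# One observation made by the accelerometer about the acceleration property.
g.add((SOS.obs1, RDF.type, SSN.Observation))
g.add((SOS.obs1, SSN.observedBy, SOS.accelerometer1))
g.add((SOS.obs1, SSN.observedProperty, SOS.Acceleration))
g.add((SOS.obs1, SSN.observationResult, SOS.output1))

# The sensor output carries the observation value, annotated for context identification.
g.add((SOS.output1, RDF.type, SSN.SensorOutput))
g.add((SOS.output1, SSN.hasValue, SOS.value1))
g.add((SOS.value1, SOS.hasObservationTime,
       Literal("2017-02-20T09:00:05", datatype=XSD.dateTime)))
g.add((SOS.value1, SOS.hasObservationLocation, SOS.cityPark))
g.add((SOS.value1, SOS.identifyContext, SOS.runningContext))
```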

3.2.5. Capabilities and Restrictions. Another subdomain complementing the sensor perspective covers sensor measurement capabilities and operational restrictions. Sensors are integrated within the suite of a smartphone; however, they are intended to be exposed and operated to provide the best performance within the defined operating conditions (SOS:OperatingCondition ⊆ SSN:Condition), which are categorized into device conditions (SOS:DeviceCondition) and environmental conditions (CXT:EnvironmentalCondition) of the atmosphere (i.e., SOS:Humidity, SOS:Temperature, and SOS:Pressure) in a particular space and time. A sensor has inherent characteristics, by design, of measurement capabilities (SSN:MeasurementCapability) that depend on the measurement properties (i.e., representing a specification of a sensor's measurement capabilities in various operating conditions) and directly affect a sensor's output. The measurement properties (SSN:MeasurementProperty) determine the behavior, performance, and reliability of a sensor. In addition, these measurement properties determine the quality of quantifiable properties, such as mass, weight, length, and speed, related to a phenomenon. These properties can be classified into accuracy (SSN:Accuracy), frequency (SSN:Frequency), power consumption (SOS:PowerConsumption), random (SOS:Random) and systematic (SOS:Systematic) errors, settling time (SOS:SettlingTime), precision (SSN:Precision), and resolution (SSN:Resolution).
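A sensor's capability specification under given operating conditions could be recorded as in the sketch below; the prefix URIs, individual names, and the numeric literal are assumptions, while the class and property names follow SSN as used in this subsection.

```python
from rdflib import Graph, Namespace, Literal
from rdflib.namespace import RDF, XSD

SOS = Namespace("http://example.org/SmartOntoSensor#")  # assumed IRI
SSN = Namespace("http://purl.oclc.org/NET/ssnx/ssn#")   # indicative SSN namespace

g = Graph()

# Measurement capability of the accelerometer within stated operating conditions.
g.add((SOS.accelerometer1, SSN.hasMeasurementCapability, SOS.accelCapability))
g.add((SOS.accelCapability, RDF.type, SSN.MeasurementCapability))
g.add((SOS.accelCapability, SSN.inCondition, SOS.roomTemperature))
g.add((SOS.roomTemperature, RDF.type, SOS.OperatingCondition))

# Measurement properties that shape behavior, performance, and reliability.
g.add((SOS.accelCapability, SSN.hasMeasurementProperty, SOS.accelAccuracy))
g.add((SOS.accelAccuracy, RDF.type, SSN.Accuracy))
g.add((SOS.accelAccuracy, SSN.hasValue, Literal("0.01", datatype=XSD.decimal)))
```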

3.2.6. Metadata. The metadata subdomain is constructed to provide detailed descriptive information regarding the origination, introduction, application, and production of an object or a phenomenon that is of interest to a decision making system. The metadata determines and affects the reliability and credibility of an object or a phenomenon. A vision of SensorML is that the schema should be self-describing, which can be accomplished by accommodating metadata about a schema within the schema [10]. An object (e.g., a sensor) would have metadata (SOS:hasSensorMetadata) that provides information about the manufacturer (SOS:hasManufacturer), model information (SOS:hasModel), serial number (SOS:hasSerialNumber), and size (SOS:hasSize). SOS:Metadata has explicit relationships with SOS:Identification, SOS:Note, and SSN:Design. SOS:Identification provides information about the recognition of a phenomenon, including manufacturer, model, size, and version. The manufacturer perspective (SOS:Manufacturer) is constructed by providing the necessary information about the manufacturer of a smartphone, resource, platform, or sensor. SOS:Manufacturer is enriched with several object properties for describing a manufacturer, including SOS:hasManufacturerEmail, SOS:hasManufacturerName, SOS:hasManufacturerLocation, and SOS:hasManufacturerWebsite. Similarly, additional information about objects is provided by establishing explicit relationships with SOS:Model, SOS:SerialNumber, SOS:Version, and SOS:Size. Similarly, SOS:Note provides a description of a phenomenon.

3.2.7. Time. The time subdomain models knowledge about time, such as temporal units and temporal entities. This subdomain has been developed by reusing the OWL Time ontology, in which TIME:CalendarClockDescription, TIME:TemporalUnit, and TIME:TimeZone are made subclasses of CXT:Time. Similarly, the day, week, month, second, minute, and hour properties of the Time ontology are used to represent time information. SSN:TimeInterval is made a subclass of CXT:Time to represent the time duration of a phenomenon.

3.2.8. Contexts and Services. The context subdomain is constructed to describe the application of sensors' generated observation values for identifying user contexts. Ontology-based context modeling is helpful for the formal and semantic enrichment and representation of complex context knowledge in order to share and integrate contextual information [28]. The context ontology from the CoDAMoS project is reused for providing relevant core knowledge for SmartOntoSensor and is extended with more detailed context types and properties. SOS:Context is the main concept in this subdomain, which is classified into more specialized contexts including CXT:Activity, CXT:Environment, SOS:Event, SOS:Device, CXT:User, and CXT:Location. Each of the subcontexts represents a subdomain which could have more specific contexts, such that CXT:Activity is classified into SOS:Motional and SOS:Stationary and SOS:Event is classified into SOS:SpatialEvent, SOS:TemporalEvent, SOS:SpatioTemporalEvent, and so on. The SOS:Device subdomain models knowledge about devices and includes a wide categorization of devices as well as their characteristics. The SOS:Environment subdomain models knowledge about the environment in terms of environmental conditions such as humidity, noise, light, and temperature. CXT:Location models knowledge about locations such as buildings, location coordinates, spatial entities, distances, and countries. For more detailed location modeling, CXT:Location is linked with geonames:GeonamesFeature. The CXT:User (i.e., CXT:User ≡ SSN:Person) subdomain models knowledge about users, such as roles, profiles, preferences, tasks, projects, publications, and socialization. For more detailed user mappings, CXT:User is linked with FOAF:Person. The SOS:Event subdomain models knowledge about users' real-life events and includes a wide categorization of events using space, time, and other characteristics. The CXT:Activity subdomain models knowledge about users' motional and stationary activities and includes their characteristics. A context needs spatial (SOS:Space), temporal (CXT:Time), and thematic (SOS:Theme) information for its description. A context is identified by the observation value (SSN:ObservationValue) that depends on a sensor output (SOS:SensorOutput). The CXT:Service subdomain fulfills the service-oriented requirement of the ontology by providing service-oriented features in ubiquitous computing. A service would be software and includes services that would be recognized and utilized on the basis of the identified context. A service could have a provider (SOS:Provider), which could be a simple or an aggregated service provider.

3.3. SmartOntoSensor Concepts Hierarchy. The SmartOntoSensor taxonomic class diagram, which forms the foundation of the ontology, is constructed from concepts that are common and specific to smartphones, sensors, and context applications. The identified concepts are hierarchically arranged by determining whether a concept is a subconcept of another concept or not. The required classes that are provided by any of the imported ontologies (e.g., SSN:Input, SSN:Output, CXT:Software, and SSN:Precision) are used directly in SmartOntoSensor, and no duplicate classes are declared, in order to eradicate any type of ambiguity. Furthermore, classes in the imported ontologies representing the same semantics are declared as equivalent classes (e.g., SSN:Platform ≡ CXT:Platform). New classes are created explicitly either as parent classes or as subclasses of the relevant classes in the imported ontologies, as per requirements. Figure 4 shows a snippet of the SmartOntoSensor concepts hierarchy.

Figure 4: Snippet of SmartOntoSensor concept hierarchy.

SmartOntoSensor, at present, contains 259 concepts, where each concept is formed by keeping in mind the unique needs and requirements of the domain. SmartOntoSensor extends the SSN ontology by making SOS:Smartphone ⊆ SSN:System, which means SOS:Smartphone is a kind of SSN:System. Smartphone platforms (SOS:Halmet, SOS:Belt, and SOS:SelfiStick), to which a smartphone can be attached during the sensing process, have unique characteristics and are made subclasses of SSN:Platform ≡ CXT:Platform to partially satisfy the needs of a smartphone platform. The SSN:Sensor concept is used to represent sensing devices in SmartOntoSensor that capture inputs and produce outputs. The SSN:Sensor concept is further divided into two categories, SOS:PhysicalSensor and SOS:LogicalSensor, to represent hardware-based sensors and software-based sensors, respectively, in a smartphone. A SOS:LogicalSensor is formed by employing one or more SOS:PhysicalSensor for data capturing and producing a unique output; for example, the e-compass sensor is formed using the accelerometer and magnetometer sensors. Real-world smartphone physical sensors such as SOS:Accelerometer, SOS:Gyroscope, SOS:Camera, and SOS:GPS are made subclasses of SOS:PhysicalSensor, and logical sensors such as SOS:Magnetometer and SOS:Gravity are made subclasses of SOS:LogicalSensor. The sensors hierarchy can be extended at any level to include more detailed and specific types of sensors. For example, an accelerometer sensor can sense motion in either 2 dimensions or 3 dimensions. Specifically, 1st- and 2nd-generation accelerometer sensors can detect motion in either of these categories, while 3rd-generation accelerometer sensors can detect motion across both categories, as shown in Figure 5. The quantifiable properties that a sensor can measure (SOS:measure) for a particular phenomenon are classified into physical quality (SOS:PhysicalQuality) or logical quality (SOS:LogicalQuality). A sensor has a type of operation, representing how a sensor senses a stimulus, which is either active or passive. Therefore, SOS:ActiveSensing and SOS:PassiveSensing are made subclasses of SOS:TypeOfOperation. The concepts SOS:DeviceCondition and CXT:EnvironmentalCondition are declared as subclasses of SOS:OperatingCondition. The metadata concepts, including SOS:SerialNumber and SOS:Size, are declared as subclasses of SOS:Identification, which is used by SOS:Metadata to provide the necessary metadata about smartphones, sensors, and platforms. A smartphone sensor has to perform sensing operations, which could be either opportunistic or participatory. Therefore, SOS:Opportunistic and SOS:Participatory are made subclasses of the SSN:Sensing class. Sensors differ by the amount of output produced, where some produce single-value outputs whereas others produce triple-value outputs. Therefore, SOS:SingleOutput and SOS:TrippleOutput are made subclasses of SSN:SensorOutput. An observation value produced by a sensor as output can be used to recognize a SOS:Context, which is divided into subclasses such as SOS:Event, SOS:Device, CXT:User, and CXT:Environment. An identified context can start a CXT:Service, which is a piece of software and is made a subclass of CXT:Software. In addition to all of the above, several concepts are explicitly identified and included in SOS: SOS:RAM, SOS:WiFi, SOS:CPU, SOS:GSM, SOS:Bluetooth, SOS:NFC, SOS:Infrared, SOS:GPU, and SOS:SDCard as subclasses of CXT:Hardware; SOS:Random and SOS:Systematic as subclasses of SOS:Error; and SOS:Calibration and SOS:Maintenance as subclasses of SSN:Process, along with SOS:Theme and SOS:Space, to provide a comprehensive set of information for improving inferencing mechanisms.

Figure 5: Snippet of SmartOntoSensor "sensor" class hierarchy.

Apart from using subclass axioms, classes are coupled with other axioms to facilitate the creation of individuals (i.e., objects) unambiguously and semantically. For example, the disjointness axiom is defined for classes belonging to the same generation level to restrict individuals' behaviors; for instance, SOS:LogicalSensor and SOS:PhysicalSensor are declared disjoint to ensure that an individual can be an instance of either of these classes but not both at the same time.
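As a minimal sketch of how such axioms can be serialized (shown here as SPARQL 1.1 INSERT DATA over the RDF encoding of OWL; the namespaces follow Algorithms 1 and 2, and the triples mirror the axioms described above rather than reproducing the released ontology file):

PREFIX sos:  <http://www.semanticweb.org/shoonikhan/ontologies/2015/9/SmartOntoSensor#>
PREFIX ssn:  <http://purl.oclc.org/NET/ssnx/ssn#>
PREFIX owl:  <http://www.w3.org/2002/07/owl#>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
INSERT DATA {
  # Smartphone is a kind of system; sensors are split into physical and logical
  sos:Smartphone     rdfs:subClassOf ssn:System .
  sos:PhysicalSensor rdfs:subClassOf ssn:Sensor .
  sos:LogicalSensor  rdfs:subClassOf ssn:Sensor .
  # An individual cannot be both a physical and a logical sensor
  sos:PhysicalSensor owl:disjointWith sos:LogicalSensor .
}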

3.4. SmartOntoSensor Properties and Restrictions. In OWL, properties represent relationships among classes. A property can be either an object property or a datatype property, either of which represents link(s) between instance(s) of the domain and instance(s) of the range. For an object property, both the domain and range should be instances of concepts, whereas for a datatype property the domain should be an instance of a concept and the range should be an instance of a datatype. Like concepts, exhaustive lists of object and datatype properties are identified from the sources for use in SmartOntoSensor.

SmartOntoSensor contains an extended list of object properties (i.e., 382), some of which come from the imported sources and others of which are explicitly created. As several of the SmartOntoSensor concepts are derived from concepts in the imported ontologies, their object properties are also inherited in the same fashion and made more specialized for SmartOntoSensor concepts. The SmartOntoSensor object properties SOS:hasWiFi, SOS:hasBluetooth, SOS:hasCPU, SOS:hasRAM, and SOS:hasDisplay are made subproperties of SSN:hasSubSystem for linking SOS:Smartphone with the individual resource classes. Another SmartOntoSensor object property, SOS:hasSmartDeployment, is made a subproperty of SSN:hasDeployment to define the relationship between SOS:Smartphone and SOS:SmartPhoneDeployment, because a smartphone deployment has unique features and methods compared to object deployments in wireless sensor networks. As several concepts of the imported ontologies are used directly, their object properties are used similarly to avoid any confusion. The object properties SSN:hasInput and SSN:hasOutput are used to represent the input and output, respectively, of SSN:Sensing. Similarly, the object property SSN:hasValue is used to represent the relationship between SSN:SensorOutput and SSN:ObservationValue. Furthermore, like concepts, a bundle of domain-specific object properties is identified for SmartOntoSensor from the additional sources to relate concepts in the ontology in more meaningful and subtle ways, such as SOS:recognizeContext, SOS:hasProcessType, SOS:organizedBy, SOS:deriveFrom, SOS:constructedFrom, SOS:hasLatency, SOS:hasFrequency, and SOS:hasSpace. Table 4 presents an excerpt of the SmartOntoSensor object properties along with their domains and ranges.
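The following sketch (again SPARQL 1.1 INSERT DATA over the OWL/RDF encoding, with namespaces as in Algorithms 1 and 2) illustrates how one such property could be declared with its superproperty, domain, and range; it shows the pattern only and is not a dump of the ontology:

PREFIX sos:  <http://www.semanticweb.org/shoonikhan/ontologies/2015/9/SmartOntoSensor#>
PREFIX ssn:  <http://purl.oclc.org/NET/ssnx/ssn#>
PREFIX owl:  <http://www.w3.org/2002/07/owl#>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
INSERT DATA {
  # hasCPU links a smartphone to its CPU resource and specializes ssn:hasSubSystem
  sos:hasCPU a owl:ObjectProperty ;
      rdfs:subPropertyOf ssn:hasSubSystem ;
      rdfs:domain sos:Smartphone ;
      rdfs:range  sos:CPU .
}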

Apart from the object properties, SmartOntoSensor contains an extensive list of 136 datatype properties, which are identified and mapped to give comprehensive information about concepts, such as SOS:hasMemorySize, SOS:hasActiveSensing, SOS:modelScientificName, SOS:latitude, SOS:eventCancelled, SOS:isConsumable, SOS:manufacturerEmail, SOS:hasBatteryModel, SOS:minValue, SOS:value, SOS:hasMaxRFPower, and SOS:nickName. Table 5 presents an excerpt of the SOS datatype properties along with their domains and ranges.

Classes in SmartOntoSensor are refined by using object properties and datatype properties to superimpose constraints and axioms for describing their individuals. An example of such constraints is the property restrictions (i.e., quantifier restrictions, cardinality restrictions, and hasValue restrictions) that describe the number of occurrences and the values of a property essential for an individual to be an instance of a class. For example, for an individual to be an instance of the SOS:Smartphone class, it is essential for the individual to have at least one occurrence of the SOS:hasCPU object property relating it to an instance of the SOS:CPU class. Tables 4 and 5 also show excerpts of the property restrictions for the SmartOntoSensor object properties and datatype properties, respectively.
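The hasCPU example corresponds to a minimum-cardinality restriction, which could be written over the OWL/RDF encoding roughly as follows (a sketch of the OWL 2 pattern, not the released axiom):

PREFIX sos:  <http://www.semanticweb.org/shoonikhan/ontologies/2015/9/SmartOntoSensor#>
PREFIX owl:  <http://www.w3.org/2002/07/owl#>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
PREFIX xsd:  <http://www.w3.org/2001/XMLSchema#>
INSERT DATA {
  # Every Smartphone must be related to at least one CPU via hasCPU
  sos:Smartphone rdfs:subClassOf [
      a owl:Restriction ;
      owl:onProperty sos:hasCPU ;
      owl:minQualifiedCardinality "1"^^xsd:nonNegativeInteger ;
      owl:onClass sos:CPU
  ] .
}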

4. Evaluation and Discussion

Several approaches have been proposed and used for evaluating ontologies from the perspectives of their quality, correctness, and potential utility in applications. The four main methods are gold-standard comparison, application-based evaluation, data sources comparison, and human-centric evaluation [29–31]. The gold-standard method advocates comparing measurements of a well-formed dataset produced by a given ontology against other datasets. The application-based evaluation method evaluates an ontology using the outcomes of an application that employs the ontology. The data sources comparison method describes the use of a repository of documents in a domain that the ontology is expected to cover together with its associated knowledge. The human-centric evaluation method emphasizes human effort, where an expert assesses the quality of a given ontology by comparing it to a defined set of criteria. Similarly, several metrics have been defined for verifying and validating ontologies, where verification determines whether the ontology is built correctly and validation is concerned with building the correct ontology [32]. A detailed discussion of ontology evaluation methods, metrics, and approaches is beyond the scope of this paper; it can be found in [31]. For evaluating SmartOntoSensor, we have used the first, second, and fourth methods along with a comparison using a multimetrics approach, accuracy checking, and consistency checking, which are discussed in detail in the subsequent sections. A top-level terminological requirements fulfillment (by providing concepts) comparison of SmartOntoSensor and other sensors and sensor networks ontologies is shown in Table 6. A tick in the table indicates the capability of the ontology to describe the stated aspect in some form, and the absence of a tick indicates either the absence of or insufficient information about the aspect.


Table 4: An excerpt of SmartOntoSensor object properties.

Domain | Object property | Range | Quantifier | Cardinality
SOS:Smartphone | SOS:hasCPU | SOS:CPU | Exis. & univ. | Min 1
SOS:Smartphone | SOS:hasOperatingSystem | SOS:OperatingSystem | Existential | Min 1
SOS:Smartphone | SOS:isPlacedOn | SOS:Halmet or SOS:SelfiStick | Universal | Max 1
SSN:Sensor | SOS:measure | SOS:QualityType | Exis. & univ. | Min 1
SSN:Sensor | SOS:produces | SSN:"Sensor Output" | Existential | Exactly 1
SOS:PhysicalSensor | SOS:hasTypeOfOperation | SOS:TypeOfOperation | Existential | Exactly 1
SOS:PhysicalSensor | SOS:calibration | SOS:calibration | Universal | Nil
SOS:LogicalSensor | SOS:constructedFrom | SOS:PhysicalSensor | Exis. & univ. | Min 1
SSN:"Sensor Output" | SOS:relatedOutput | SSN:"Sensor Output" | Universal | Nil
SSN:"Sensor Output" | SOS:withAccuracy | SSN:Accuracy | Universal | Nil
SOS:Sensor | SOS:hasObservation | SSN:Observation | Universal | Min 1
SOS:QuatityValue | SOS:hasUnit | SSN:"Unit of Measure" | Existential | Exactly 1
SOS:Metadata | SOS:hasManufacturer | SOS:Manufacturer | Universal | Min 1
SOS:Metadata | SOS:hasVersion | SOS:Version | Universal | Max 1
SSN:Sensor | SOS:hasEnvironmentCondition | CXT:EnvironmentalCondition | Universal | Nil
SSN:Process | SOS:hasProcessType | SOS:ProcessType | Existential | Max 1
SSN:Process | SOS:hasSubProcess | SSN:Process | Universal | Nil
SSN:Sensing | SOS:hasControlStructure | SOS:ControlStructure | Universal | Min 1
SSN:Sensor | SOS:recognizeContext | SOS:Context | Universal | Min 1
SSN:FeatureOfInterest | SOS:isContainedIn | SSN:Observation | Exis. & univ. | Min 1
SSN:"Observation Value" | SOS:identifyContext | SOS:Context | Exis. & univ. | Min 1
SOS:Context | SOS:hasTheme | SOS:Theme | Exis. & univ. | Min 1
CXT:User ≡ SSN:Person | SOS:hasPreferenceProfile | SOS:PreferenceProfile | Exis. & univ. | Nil
SOS:Event | SOS:attende | CXT:User ≡ SSN:Person | Exis. & univ. | Min 1
CXT:Location | SOS:hasFeatures | geonames:GeonamesFeatures | Universal | Nil
CXT:Activity | SOS:hasActivityLocation | CXT:Location | Exis. & univ. | Min 1

4.1. Gold-Standard-Based Evaluation. SmartOntoSensor is the first attempt to ontologically model smartphone sensory data and its context mapping. Therefore, no counterpart exists in the domain for comparison with SmartOntoSensor. However, a number of ontologies have been developed for sensors and sensor networks, including SSN [8], CSIRO [7], OntoSensor [5, 10], and CESN [12], which are the most frequently cited in the literature. Therefore, SmartOntoSensor is compared with them to provide insights into its quality. Metrics and automated tools for evaluating the quality of an ontology have been defined in recent works such as OntoQA [33, 34]. OntoQA is a feature-based ontology quality evaluation and analysis tool that can evaluate an ontology at both the schema and knowledge base levels. OntoQA uses a set of metrics to evaluate the quality of an ontology from different aspects, including the number of classes and properties, relationship richness, attribute richness, and inheritance richness. OntoQA is used to evaluate the quality of SmartOntoSensor. However, the evaluation and comparison are limited to the schema level only because the knowledge bases of the competing ontologies are unavailable. The overall comparison of SmartOntoSensor with the competing ontologies using the OntoQA schema metrics is shown in Table 7.
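As an illustration of how such schema-level statistics can be gathered from an OWL file (a sketch only; OntoQA itself defines the exact metric formulas and tooling), the raw class and object-property counts from which ratios such as relationship richness are derived can be obtained with a SPARQL query over the schema graph:

PREFIX owl: <http://www.w3.org/2002/07/owl#>
SELECT (COUNT(DISTINCT ?c)  AS ?classes)
       (COUNT(DISTINCT ?op) AS ?objectProperties)
WHERE {
  { ?c a owl:Class . FILTER (isIRI(?c)) }   # named classes only
  UNION
  { ?op a owl:ObjectProperty }
}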

Using the information provided in [33, 34], the interpretation of the schema-metric results indicates that SmartOntoSensor has an improved ontology design with rich potential for knowledge representation as compared to the existing ontologies. SmartOntoSensor provides enhanced coverage of its broader modeling domain by having a larger number of classes and relationships than the existing ontologies. This also indicates that SmartOntoSensor is more complete, covering the problem domain well enough to provide answers to almost any of the ontology's domain-related questions. OntoSensor also shows large class and relationship counts; however, OntoSensor mainly describes the spectrum of sensor concepts (i.e., the hierarchy of sensor classes and subclasses) and data. SmartOntoSensor, on the other hand, includes an extensive list of classes and relationships for describing broad aspects of smartphone systems, sensors, observation measurements, and context applications. The highest relationship richness value shows that SmartOntoSensor has diversity in the types of relations in the ontology. Instead of relying only on inheritance relationships (which usually convey less information), SmartOntoSensor contains a diverse set of relationships to convey almost complete information about the domain. The seemingly slight advantage in relationship richness of SmartOntoSensor over CSIRO is in fact substantial, given the much larger number of SmartOntoSensor classes compared to CSIRO.


Table 5: An excerpt of SmartOntoSensor datatype properties.

Domain | Datatype property | Range | Quantifier | Cardinality
SOS:Manufacturer | SOS:manufacturerName | String | Exist. & univ. | Min 1
SOS:SerialNumber | SOS:serialNumber | String | Existential | Exactly 1
SOS:Size | SOS:weight | Float | Universal | Max 1
SSN:"Sensor Output" | SOS:isValid | Boolean | Existential | Exactly 1
SOS:Smartphone | SOS:hasIMEI | String | Existential | Exactly 1
SOS:Smartphone | SOS:isConsumable | Boolean | Universal | Exactly 1
SOS:MemoryResource | SOS:hasMemorySize | String | Existential | Exactly 1
SOS:PowerSupplyResource | SOS:hasPowerCapacity | String | Existential | Exactly 1
SOS:Version | SOS:versionNumber | String | Existential | Exactly 1
SOS:Offset | SOS:accXAxis | Float | Existential | Exactly 1
SOS:SpaceTuple | SOS:latitude | Float | Existential | Exactly 1
SOS:Orientation | SOS:yaw | Float | Existential | Exactly 1
SOS:PersonalProfile | SOS:homePage | String | Universal | Nil
SOS:ContactProfile | SOS:phone | String | Universal | Max 1
ontology:Location | SOS:officialName | String | Exist. & univ. | Min 1
SOS:QualityValue | SOS:value | Float | Existential | Exactly 1
SSN:"Unit of measure" | SOS:unit | String | Existential | Exactly 1
SOS:DataContainer | SOS:path | String | Existential | Exactly 1
ontology:Humidity | SOS:hasHumidityIntensity | String | Existential | Exactly 1
ontology:Lighting | SOS:hasLightIntensity | String | Existential | Exactly 1
SOS:SmartphoneMetadata | SOS:hasDescription | String | Universal | Nil
SOS:DataContainer | SOS:hasName | String | Existential | Exactly 1
SOS:ContactProfile | SOS:mobile | PositiveInteger | Universal | Nil

The increased number of relationships and the relationship richness also indicate the high cohesiveness of SmartOntoSensor, whose classes are strongly and intensively related. The lowest inheritance richness value indicates that SmartOntoSensor is a vertical ontology, covering the domain in a detailed manner. In other words, SmartOntoSensor is concise, having neither irrelevant concepts nor redundant representations of the semantics of the domain. The large number of classes and relationships in SmartOntoSensor makes its slight difference in inheritance richness much more significant in comparison with the other available ontologies. The lowest tree balance value indicates that SmartOntoSensor can be viewed more as a tree compared to the others. The highest class richness value of SmartOntoSensor indicates an increased distribution of instances across classes, allowing a knowledge base to utilize and represent most of the knowledge in the schema. In other words, most of the SmartOntoSensor classes would be populated with instances, compared to the others. Comparatively, SmartOntoSensor is ranked the highest because of its larger schema size in terms of the number of classes and relationships. OntoQA cannot directly calculate the coupling measure of an ontology. However, using the coupling definition in [31], SmartOntoSensor also shows high coupling by referencing an increased number of classes from the imported ontologies.

4.2. Multicriteria Approach Based Evaluation. To evaluate the semantics and understandability of the terms used in an ontology, researchers have not yet established a common consensus on widely accepted methods (i.e., tools and metrics) in computer science, possibly because it is a relatively new field. However, to evaluate ontology terminology along with its popularity, an approach comprising several decision criteria or attributes is defined in [35], where numerical scores can be assigned to each criterion. The overall score is calculated as the weighted sum of the per-criterion scores [29]. An objective formula for terminology evaluation and popularity measurement, shown in (1), has been used from [35]; it is based on the objective multicriteria matrices defined in [36].

Objective = I · w_i + C · w_c + O · w_o + P · w_p.  (1)

In (1), I denotes interoperability, with I = N/T, where T is the total number of terms in the ontology and N is the number of terms findable in WordNet. C is clarity, with C = (Σ 1/A_i)/N, where A_i is the number of meanings of each interoperable term in WordNet, so the clarity of each term is 1/A_i. O represents comprehensiveness, with O = T/M, where M is the number of terms in the standard term set of the domain that the ontology belongs to. P denotes popularity, with P = E/H, where E is the number of access instances of the ontology and H is the total number of access instances of all the ontologies in the same domain. The weights w_i, w_c, w_o, and w_p represent the weights of interoperability, clarity, comprehensiveness, and popularity, respectively, subject to the condition w_i + w_c + w_o + w_p = 1.


Table 6: Top-level terminological requirements fulfillment (using availability of concepts) comparison of SmartOntoSensor and other sensors and sensor networks ontologies. The table indicates, for each of Avancha et al. [11], OntoSensor [5, 10], CESN [12], Eid et al. [13], CSIRO [7], SSN [8], ISTAR [14], Kim et al. [15], and SmartOntoSensor, which top-level concepts are covered across the categories of base terms (system, sensor, components/resources, process, context), system terms (platform, hardware, software, deployment, system metadata, components/resources such as power supply, CPU, memory, and networking), sensor terms (physical and logical sensor hierarchy, active/passive type of operation, operating condition, history, sensor metadata, sensor process, and measurement properties such as frequency, latency, accuracy, resolution, measurement range, precision, power consumption, and sensitivity), observation terms (data/observation, response model, observation condition), domain terms (unit of measurement, feature/quality, sampled medium), context terms (location, time, activity, event, user, service), and storage terms (file, folder).

Table 7: Statistics of SmartOntoSensor and other sensors and sensor networks ontologies using OntoQA.

Ontology | Classes | Relationships | Relationship richness | Inheritance richness | Tree balance | Class richness | Rank
SSN [8] | 47 | 52 | 59.09 | 2.4 | 1.76 | 66.18 | IV
OntoSensor [5, 10] | 286 | 219 | 46.39 | 2.63 | 1.66 | 59.37 | II
CESN [12] | 35 | 18 | 39.13 | 2.54 | 1.53 | 71.01 | V
CSIRO [7] | 70 | 70 | 65.42 | 2.84 | 1.36 | 39.65 | III
SOS | 259 | 382 | 65.52 | 2.23 | 1.19 | 84.67 | I

Figure 6: Objective analysis of SmartOntoSensor using the multicriteria approach (evaluation results, on a scale of 0 to 0.45, for OntoSensor, CESN, CSIRO, SCO, and SOS).

WordNet is a large lexical database of English words that links words by semantic relationships and has been used in several ontology development research efforts for terminology definition, analysis, and filtering, such as [34]. Following the guidelines from [34], WordNet is used for the evaluation of term interoperability, clarity, and comprehensiveness, and a literature citation index is used for the evaluation of popularity. During the analysis, multiword terms in SmartOntoSensor are broken into individual words and stemmed to obtain more accurate metric values. The results obtained by [35] using the same formula are extended by the inclusion of the SmartOntoSensor results and are shown in Figure 6.
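For a purely illustrative (hypothetical) set of values, with equal weights w_i = w_c = w_o = w_p = 0.25 and measured scores I = 0.8, C = 0.5, O = 0.6, and P = 0.1, formula (1) yields Objective = 0.25 × (0.8 + 0.5 + 0.6 + 0.1) = 0.5; the actual scores plotted in Figure 6 are computed in the same way from the WordNet and citation statistics described above.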

Analyzing the results indicates that SmartOntoSensor has acceptable levels of term interoperability and comprehensiveness, medium clarity, and the lowest popularity. Overall, SmartOntoSensor shows reasonable analysis results, signifying superiority over the others due to its rich set of terms (i.e., classes and properties), although it scores slightly below OntoSensor because of its lowest popularity value. Care has been taken while calculating the results, but the approach involves human effort (due to the unavailability of automated tools), so the chance of errors in the process cannot be ignored.

4.3. Accuracy Checking. To demonstrate the correctness and utility of SmartOntoSensor, accuracy checking is performed to determine whether the knowledge asserted in the ontology agrees with the experts' knowledge of the domain. An ontology with correct definitions and descriptions of classes, properties, and individuals will result in high accuracy [31]. Recall and precision are the two primary Information Retrieval (IR) measures used for evaluating the accuracy of an ontology [31]. The recall and precision rates are defined in (2) and (3), respectively [13], and SmartOntoSensor is required to maximize both recall and precision for its acceptability.

recall rate = (number of relevant items retrieved) / (total number of relevant items),  (2)

precision rate = (number of relevant items retrieved) / (total number of items retrieved).  (3)
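As a hypothetical illustration of these measures, if a competency-question query returns 10 items of which 8 are relevant, while the knowledge base actually contains 9 relevant items, then recall = 8/9 ≈ 0.89 and precision = 8/10 = 0.80.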

The accuracy of SmartOntoSensor is determined by computing the recall and precision rates for the functional requirements (represented as competency questions) by executing SPARQL queries on the SmartOntoSensor knowledge base. To form the knowledge base, SmartOntoSensor is instantiated with relevant information from the USC Human Activity Dataset (USC-HAD) (http://www-scf.usc.edu/~mizhang/), manuals, reports, and documentation using Protege 4.3. Based on these instances, the SPARQL query language is used through the Protege SPARQL Query plug-in to query the knowledge base and retrieve relevant results. The scenarios (built using competency questions) used for the proof of concept assume utilizing low-level sensory data of heterogeneous sensors and mapping it to high-level queries. For example, a possible source for detecting and monitoring the signatures of a human fall is tracking the vector forces exerted during the fall together with the location changes. Therefore, mapping the fall concept to a concept that can be determined by the accelerometer, gyroscope, and GPS sensors through the relationship SOS:isDetectedBy can enhance human fall detection. The scenarios used for the proof of concept include (1) using microphone and GPS low-level sensory data to search for locations with high noise pollution and (2) using low-level sensory data to detect a user's context and automatically initiate a corresponding service (i.e., application). The actual SPARQL queries for these scenarios are shown in Algorithms 1 and 2, respectively.

With the assistance of low-level sensory data, the test queries produced acceptable precision and recall rates by retrieving the relevant data (i.e., all of the locations having greater sound levels and all of the sensors having measurement capabilities of acceptable levels). Therefore, SmartOntoSensor has shown potential effectiveness in integrating heterogeneous sensory data to answer high-level queries.

4.4. Consistency Checking. OWL-DL is used as the knowledge representation language for the SmartOntoSensor content.


PREFIX rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#>
PREFIX owl: <http://www.w3.org/2002/07/owl#>
PREFIX xsd: <http://www.w3.org/2001/XMLSchema#>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
PREFIX sos: <http://www.semanticweb.org/shoonikhan/ontologies/2015/9/SmartOntoSensor#>
SELECT ?name ?latitude ?longitude
WHERE {
  ?mic sos:hasMicrophoneValue ?output .
  ?output sos:hasSoundOutput ?soundvalue .
  ?output sos:relatedOutput ?GPSoutput .
  ?GPSoutput sos:latitude ?latitude .
  ?GPSoutput sos:longitude ?longitude .
  ?GPSoutput sos:isCoordinatesOf ?location .
  ?location sos:officialName ?name .
  FILTER (?soundvalue > "65.0"^^xsd:float)
}

Algorithm 1: SPARQL query for retrieving locations having noise intensity greater than 65 dB.

PREFIX rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#>
PREFIX owl: <http://www.w3.org/2002/07/owl#>
PREFIX xsd: <http://www.w3.org/2001/XMLSchema#>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
PREFIX ssn: <http://purl.oclc.org/NET/ssnx/ssn#>
PREFIX cxt: <http://a.com/ontology#>
PREFIX sos: <http://www.semanticweb.org/shoonikhan/ontologies/2015/9/SmartOntoSensor#>
SELECT ?context ?service
WHERE {
  {
    ?acc sos:hasAccelerometerValue ?offset .
    ?offset sos:accXAxis ?xaxis .
    ?offset sos:accYAxis ?yaxis .
    ?offset sos:accZAxis ?zaxis .
    FILTER (((?xaxis >= "-10.9"^^xsd:float) && (?xaxis <= "0.4"^^xsd:float)) &&
            ((?yaxis >= "-0.5"^^xsd:float) && (?yaxis <= "0.6"^^xsd:float)) &&
            ((?zaxis >= "-15.0"^^xsd:float) && (?zaxis <= "18.0"^^xsd:float))) .
    ?offset ssn:hasValue ?obsvalue .
    ?obsvalue sos:hasObservationLocation ?obsloc .
    ?obsvalue sos:identifyContext ?context .
    ?context sos:hasActivityLocation ?cxtloc .
    ?cxtloc sos:hasCoordinates ?activityloc .
  }
  FILTER (?obsloc = ?activityloc) .
  ?context sos:startService ?service .
}

Algorithm 2: SPARQL query for detecting a context and service using low level sensory data.

A significant feature of ontologies described using OWL-DL is that they can be processed using reasoners [13, 37]. Using the descriptions (conditions) of classes, a reasoner can determine whether a class can have an instance or not. Following the methodology proposed by [36] to validate the consistency of an ontology, SmartOntoSensor is passed through two major tests: (1) a subsumption test to check whether a class is a subclass of another class or not and (2) a logical consistency check to see whether a class can have any instance or not. The FaCT++ 1.6.2 and RacerPro 2.0 reasoners are used because of their strong reasoning capabilities and interoperability with Protege [13]. Both reasoners are fed the manually created class hierarchy (the asserted ontology) to automatically compute an inferred class hierarchy (the inferred ontology) using the descriptions of classes and relationships. The inferred ontology is compared with the asserted ontology, and it is found that the two class hierarchies match and none of the classes is inconsistent. However, classes are reclassified and repositioned by the reasoners when they have many superclasses or are subject to certain logical constraints. Therefore, both tests are significant, and SmartOntoSensor is logically consistent and valid.


Figure 7: SmartOntoSensor asserted and inferred class hierarchies after using reasoner.

Figure 7 depicts a snippet of the SmartOntoSensor asserted and inferred class hierarchies after reasoning.
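In addition to the DL reasoners, simple subsumption checks over the asserted hierarchy can be scripted, for example with a SPARQL ASK query such as the following sketch (run against the asserted RDF graph; a reasoner or an entailment-aware store is still needed to verify inferred subsumptions):

PREFIX sos:  <http://www.semanticweb.org/shoonikhan/ontologies/2015/9/SmartOntoSensor#>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
# Is Accelerometer (directly or transitively) a subclass of PhysicalSensor?
ASK { sos:Accelerometer rdfs:subClassOf* sos:PhysicalSensor }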

4.5. Application-Based Evaluation. Ontologies can be plugged into applications, where they largely influence the outputs of an application for a given task. Therefore, ontologies can also be evaluated simply by analyzing the results of the applications that use them. In the following sections, we demonstrate a general architecture for context-aware applications that uses SmartOntoSensor and the development of a real-world smartphone context-aware application using that architecture.

4.5.1. Application Architecture. To demonstrate the feasibility and verifiability of SmartOntoSensor in real-world context-aware applications, we have developed the multilayer architecture shown in Figure 8. The architecture comprises four layers: the sensors layer, the information layer, the semantic layer, and the application layer, which are concisely described under the following headings.

(1) Sensors Layer. The sensors layer comprises the smartphone sensors (i.e., physical and logical) that provide not only raw sensory data from the environment but also additional related information for effective context recognition, such as the Bluetooth ID of a nearby object for determining the location and proximity of a user.

(2) Information Layer. The information layer extracts and receives raw sensory data and processes it into information. The information layer consists of two sublayers: the collection engine and the data aggregation and association sublayer. The collection engine acts as an interface between the sensors layer and the data aggregation and association sublayer and contains several components. Sensor configuration defines sensor management activities (e.g., defining the reading rate), and the sensor acquisitor defines the methods of reading data from sensors, such as event-based or polling-based, and from a single sensor or multiple sensors in parallel. Sensor data processing extracts meaningful features from the sensor streams and uses the capturing engine to store the sensory and other contextual information in local temporary storage. The layer can also apply machine learning techniques to extract fine-grained contextual information from the sensory information. Data is temporarily stored locally to define a complete set of data about an event. The data aggregation and association sublayer collects data about a context from storage and establishes relationships with other co-occurring contexts. This sublayer clusters the data into context and subcontext groups and also extracts data from other sources (e.g., a calendar entry for naming a context) to portray a complete picture of a context. The complete set of data is converted into an exchangeable format (e.g., JSON) for communication to the other layers.

(3) Semantic Layer. The semantic layer adds the semantic glue to the architecture by mapping low-level context features into a high-level semantic model (i.e., the ontology). The semantic rule mapping and extension sublayer implements a web service interface to receive and extract information from well-formed messages (i.e., JSON). The sublayer uses a direct approach to define the mapping and semantic rules that instantiate individuals in the SmartOntoSensor ontology, annotate them with their datatype and object properties, and define relationships between individuals within the ontology. The direct approach is favorable because of its simplicity, whereas a generic approach is complex, requiring a multiplicity of resources (i.e., vocabularies, classification algorithms, and domain-specific ontologies). However, the architecture supports the generic approach by providing the required resources either at the information layer or at the semantic layer. In the case of the direct approach, new rules need to be defined for new concepts. Rules are defined in an event-condition-action format; for example, see Algorithm 3.

The semantic engine framework denotes a Semantic Web framework (e.g., AndroJena for Android-based smartphones) that provides the capabilities of hosting ontologies (e.g., SmartOntoSensor); inferencing and reasoning mechanisms to detect inconsistency in ontologies and deduce new knowledge on the basis of existing knowledge; features for updating ontologies with new individuals and annotations and for extending ontologies by defining new classes and properties; a triple store for storing data in RDF format; and a SPARQL query engine for handling user-initiated SPARQL queries.


Figure 8: Architecture for smartphone context-aware applications using SmartOntoSensor.

IF EXIST event/sensors/GPS THEN
{
  CREATE INDIVIDUAL OF CLASS GPS IN SMARTONTOSENSOR
    WITH NAME event/sensors/GPS/@locationname
  IF EXIST event/sensors/GPS/@LatitudeValue THEN
  {
    SET DATATYPE PROPERTY hasLatitude TO event/sensors/GPS/@LatitudeValue
  }
  IF EXIST event/sensors/GPS/@LongnitudeValue THEN
  {
    SET DATATYPE PROPERTY hasLongnitude TO event/sensors/GPS/@LongnitudeValue
  }
}

Algorithm 3: Example of event-condition-action format.


Table 8: Hypothetical exemplary contexts and their corresponding modes values.

Context | Audio volume | Screen brightness | Font size | Vibration | Write disk
Location:House | High | Normal | Normal | No | Phone storage
Location:Office | Normal | Normal | Normal | Yes | SD card
Location:Meeting | Silent | Bright | Normal | Yes | SD card
Activity:Sitting | Low | Low | Small | No | Phone storage
Activity:Walking | Normal | Normal | Normal | Yes | Phone storage
Activity:Running | High | Bright | Extra large | Yes | Phone storage

Figure 9: Changing modes by ModeChanger using contextual information from SmartOntoSensor.


(4) Application Layer. The application layer represents the context-aware applications, which can exploit the full potential of the architecture in general and of SmartOntoSensor in particular to automatically recognize contexts and initiate services accordingly. The architecture provides a composite framework that can be used by a wide variety of developers for developing any type of context-aware application. As lifestyles change over time and new types of data sources (i.e., sensors) emerge, new mapping and semantic rules need to be defined accordingly. Mapping and semantic rule applications can be provided either as a separate application or as an integral part of a context-aware application.

4.5.2. ModeChanger Application. Using the architecture shown in Figure 8, we have developed a prototype application called ModeChanger. A smartphone mode defines the number and nature of the features, resources, and services available for consumption at a given time instant. Modern smartphone operating systems support a number of operation modes that can be explicitly adjusted by users to define the smartphone's behavior according to a context. Currently, the available modes include airplane/flight mode, normal mode, audio mode, and brightness mode. Context-dependent mode changing can benefit from manipulating and accessing context information. The application eases human-phone interaction and adjusts the smartphone modes using contextual information. The prototype application targets Android-based smartphones running Ice Cream Sandwich 4.0.3 or higher and is developed in the Java programming language using Android SDK tools revision 22.6.3 with the Eclipse IDE and SensorSimulator-2.0-RC1 running on a desktop machine. The target code is deployed and tested on a Samsung Galaxy SIII running the Android Jelly Bean 4.1.1 operating system. The application runs inconspicuously in the background as a service and utilizes SmartOntoSensor to deduce contextual information from low-level sensory data, automatically adjusting the modes by invoking low-level services. A fuzzy logic controller is used to direct the overall adjustment process according to contexts. To adjust the smartphone's behavior, the application adjusts the audio volume, screen brightness, font size, vibration, and default write disk according to the identified context. Table 8 lists a few hypothetical exemplary contexts and their corresponding mode values. The application was tested closely, and it was found that it can successfully differentiate between different user contexts in real time by mapping low-level sensory data into high-level contexts and triggering services accordingly. Figure 9 presents screenshots of ModeChanger changing modes according to the changing contexts.


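A simplified sketch of the kind of query ModeChanger could issue against the knowledge base is shown below; the mode-related properties (sos:hasAudioVolume, sos:hasScreenBrightness, and sos:hasWriteDisk) are hypothetical names used only for illustration, whereas sos:identifyContext and sos:startService appear in Algorithm 2:

PREFIX sos: <http://www.semanticweb.org/shoonikhan/ontologies/2015/9/SmartOntoSensor#>
SELECT ?context ?volume ?brightness ?disk
WHERE {
  ?obsValue sos:identifyContext ?context .
  # Hypothetical mode profile attached to the recognized context
  ?context sos:hasAudioVolume ?volume ;
           sos:hasScreenBrightness ?brightness ;
           sos:hasWriteDisk ?disk .
}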

4.6. Experimental Method Based Evaluation. Using the guidelines of [38], an empirical study was designed to evaluate the effectiveness of SmartOntoSensor. The empirical study is composed of experiments for evaluating SmartOntoSensor from the following aspects: (1) requirements coverage, (2) goals and design characteristics, (3) explicitness and usability, (4) reusability, (5) stability, (6) modeling mistakes, and (7) the ModeChanger application. The data for the empirical study was collected from the participants using a questionnaire. The total number of participants in the study is 17, all having substantial experience and knowledge of ontology development and evaluation, knowledge of OWL, and familiarity with the problem domain. The participants were selected voluntarily from the master's (M.S.) and Ph.D. students specializing in the areas of Web Semantics and Wireless Sensor Networks in the Department of Computer Science, University of Peshawar. To do justice to the study, a one-day workshop on ontology development in general and SmartOntoSensor development in particular was arranged for the participants. Furthermore, the participants were provided with the SmartOntoSensor source code, and the ModeChanger application was installed on their smartphones for observation, experimentation, and practical usage for five days. At the end of this period, each participant individually filled out a questionnaire composed of 35 questions that are comprehensive enough to analyze SmartOntoSensor from the abovementioned aspects. The questions in the questionnaire are propositions whose answers are selected from a 5-level Likert scale ranging from "strongly disagree" to "strongly agree." Table 9 presents statistical information on the participants' responses to the questions in percentages. About 71.4% of the participants' responses (17.8% strongly agree and 53.6% agree) agreed on the effectiveness of SmartOntoSensor and showed their confidence, 13.5% of the responses (3.9% strongly disagree and 9.6% disagree) did not, and 15.1% of the responses remained neutral.

The Likert-scale data is ordinal: the order of the values is significant, but the exact difference between the values is not known. To analyze the ordered 5-level Likert-scale response data using Chi-square descriptive statistics, the five response categories (strongly disagree, disagree, neutral, agree, and strongly agree) are collapsed into two nominal categories (disagree and agree) by combining the lower three categories and the upper two categories, respectively. Chi-square is an important statistic for the analysis of categorical data. Table 10 presents the division of the five-level Likert-scale response categories into the two nominal categories, the percentage values within the nominal categories, and the totals. The Chi-square test was executed on the nominal categories using SPSS 16.0 to assess the effectiveness of SmartOntoSensor. The null and alternative hypotheses are, respectively, as follows:

(H0) SmartOntoSensor is not an effective ontology for smartphone-based context-aware computing.

(H1) SmartOntoSensor is an effective ontology for smartphone-based context-aware computing.

The results of the Chi-square test are shown in Table 11. The top row of the table shows the Pearson Chi-square statistic, χ² = 5.950E2 with p < 0.001. The null hypothesis (H0) is rejected, since p < 0.05 (in fact, p < 0.001). Therefore, the alternative hypothesis (H1) holds, which signifies that SmartOntoSensor is an effective ontology for smartphone-based context-aware computing.
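For reference, the Pearson statistic reported in Table 11 is the standard chi-square test statistic over the observed and expected category counts (this definition is supplied here for clarity and is not reproduced from the paper's own equations):

\chi^2 = \sum_{i} \frac{(O_i - E_i)^2}{E_i},

where O_i and E_i denote the observed and expected frequencies of category i under the null hypothesis.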

5. Applications of SmartOntoSensor

SmartOntoSensor is an attempt to provide a semantic model by associating metadata with smartphone and sensory data for use in potential context-aware smartphone applications. It has a broad spectrum of applications and can find a place anywhere smartphone sensors are employed. A few of the broad application areas of SmartOntoSensor are as follows.

5.1. Linked Open Data. Linked Open Data (LOD) uses Semantic Web technologies to provide an infrastructure for publishing structured data on the web about any domain in a way that is formal and explicit and can be linked to or from external datasets [39] to increase its usefulness. Information captured in SmartOntoSensor can be linked with data sources on the LOD cloud, including GeoData and DBpedia, and with Internet of Things (IoT) objects that are discoverable and accessible through the LOD cloud. SmartOntoSensor concepts can be automatically annotated with information from other spatially and thematically related sensors to provide a scalable and semantically rich data model. In this way, information can be integrated from different communities and sources for a number of purposes, including drawing conclusions, creating business intelligence, and automated decision making. The large-scale data generated by SmartOntoSensor and its linkage with LOD will result in Linked Big Sensor data, providing a novel platform for publishing and consuming smartphone sensor data, which can be queried, retrieved, analyzed, reasoned over, and used for inference to solve real-world problems [39].
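As a sketch of such a linkage (illustrative only: the public DBpedia SPARQL endpoint and the dbo:abstract property are assumed, and the join on labels is deliberately simplified), a federated SPARQL query could enrich a location recorded by SmartOntoSensor with background knowledge from DBpedia:

PREFIX sos:  <http://www.semanticweb.org/shoonikhan/ontologies/2015/9/SmartOntoSensor#>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
PREFIX dbo:  <http://dbpedia.org/ontology/>
SELECT ?name ?abstract
WHERE {
  ?location sos:officialName ?name .
  SERVICE <http://dbpedia.org/sparql> {
    ?place rdfs:label ?label ;
           dbo:abstract ?abstract .
    FILTER (str(?label) = str(?name) && lang(?abstract) = "en")
  }
}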

5.2. Lifelogging. Lifelogging is a type of pervasive computing that uses digital sensors to capture and archive a unified digital record of people's lifetime experiences in multimedia format for augmenting human memory. Researchers have shown that the smartphone is an ideal platform for potential lifelogging systems due to its technological advancements, sensing capabilities, and role as a constant companion of users [40]. Lifelogging systems can use SmartOntoSensor for the semantic annotation of captured lifelog information together with sensory information and contexts derived from low-level sensory data. SmartOntoSensor can allow semantic-based reasoning on the captured lifelog data and their metadata for deducing new semantics and relationships among lifelog objects. Furthermore, the ontology can enhance the retrieval of lifelog information for augmenting memory in real time by allowing users to concisely express their queries and obtain precise answers using the semantics of the contained data and queries. Lifelog information can be further enriched by exploiting the potential of SmartOntoSensor for linking to and extracting data from data sources in the LOD cloud.


Table 9: Participants' responses to the questions in the questionnaire (35 questions × 17 participants).

Likert-scale option | Frequency | Percent | Valid percent | Cumulative percent
Strongly disagree | 23 | 3.9 | 3.9 | 3.9
Disagree | 57 | 9.6 | 9.6 | 13.4
Neutral | 90 | 15.1 | 15.1 | 28.6
Agree | 319 | 53.6 | 53.6 | 82.2
Strongly agree | 106 | 17.8 | 17.8 | 100.0
Total | 595 | 100.0 | 100.0 |

Table 10: Division of 5-level Likert-Scale response categories into nominal categories and their percentage values within nominal categories.

Nominal category | Distribution | Strongly disagree | Disagree | Neutral | Agree | Strongly agree | Total
Disagree | Count | 23 | 57 | 90 | 0 | 0 | 170
Disagree | % within disagree | 13.5% | 33.5% | 52.9% | .0% | .0% | 100.0%
Agree | Count | 0 | 0 | 0 | 319 | 106 | 425
Agree | % within agree | .0% | .0% | .0% | 75.1% | 24.9% | 100.0%
Total | Count | 23 | 57 | 90 | 319 | 106 | 595
Total | % within total | 3.9% | 9.6% | 15.1% | 53.6% | 17.8% | 100.0%

Table 11: Results of the Chi-Square test using SPSS 16.0.

Descriptive statistic | Value | df | p value
Pearson Chi-square | 5.950E2 (a) | 4 | .000
Likelihood ratio | 711.941 | 4 | .000
Linear-by-linear association | 425.035 | 1 | .000
N of valid cases | 595 | |
(a) 0 cells (.0%) have an expected count less than 5. The minimum expected count is 6.57.

5.3. Smart Environment. In a smart environment, smartphones can serve as smart objects because of their small, ubiquitous nature, their sensing capabilities for detecting the environment, and their communication capabilities for communicating with other objects through local networks or the Internet under the Internet of Things (IoT) domain, creating a machine-to-machine ecosystem [41]. The increasing use of and need for personal smart objects, such as indoor or outdoor sensors and actuators, have created the need to develop applications that a user may use to log and interact with these devices through the prism of the Internet or IoT. The SmartOntoSensor ontology can be a key enabler for automatic smart environment applications. Smartphone applications can leverage SmartOntoSensor for recording and mapping sensory data captured from smartphone sensors and sensors in the environment and for transferring the captured and inferred data to a web database that can be shared with others (e.g., friends in a social network). Smart smartphone applications can use SmartOntoSensor for inferring users' contexts and automatically directing or controlling smart objects in the environment accordingly. Similarly, SmartOntoSensor can be an important resource for Google's Android at Home [42], which uses the IoT to allow individuals to log into their accounts and control their smart objects at home.

5.4. Activity Recognition. Smartphones offer a unique opportunity to sense and recognize human activities from location, proximity, and communication data [43]. Making a smartphone aware of users' activities fits into the larger framework of context awareness. Researchers have demonstrated successful applications of statistical machine learning models for smartphone-based activity recognition. However, activity recognition systems, despite producing significant accuracy in recognizing activities, suffer from a number of problems, including the use of a small number of sensors, the need for an extensive amount of training data, and the coverage of only a small set of coarse-grained activities. Similarly, some activities are hard to recognize because of the type and number of sensors used, the duration of an activity, or the statistical classification algorithms applied [44]. Recognizing a large set of activities with minimal processing requirements is essential for satisfying the diverse applications of smartphone-based activity recognition systems. SmartOntoSensor, by capturing and fusing data obtained from on-board sensors as well as external sensors and sources, would enable the fine-grained classification and recognition of not only a large set of activities but also hard-to-recognize activities, with minimal processing requirements.

5.5. Augmented Reality (AR). AR is the combination/registration and alignment of virtual and physical objects, as well as their interaction, in three dimensions in real time [45]. Smartphones can serve as an attractive platform for AR systems due to their advanced computational capabilities, user interfaces, and sensors, and they make it feasible to integrate all of the solutions in one place, as in MARA and the MIT SixthSense project [45]. SmartOntoSensor can serve as a bottom layer supporting upper-layer AR systems. Data contained in SOS can be used by an AR system for numerous purposes, including registration/authentication, localization, viewing, indoor/outdoor navigation guidance, context recognition, and annotating objects with sensory and contextual information.

6. Conclusion

The potential power of smartphone sensing has been realized owing to the widespread adoption of sensor-enabled smartphone technologies by people across several demographics and cultures. Smartphone technological advancements and the integration of high-value sensors have pushed the development of smart sensing applications by the research community, academia, and industry for solving real-world problems. However, the integration and utilization of the huge amount of heterogeneous data generated by heterogeneous smartphone sensors needs semantic representation as a prerequisite for its unambiguous and precise interpretation and advanced analytical processing. As a solution, an ontology can serve as a powerful tool, containing detailed definitions of concepts regarding smartphones and sensors as well as their properties, to solve the integration, processing, interpretation, and interoperability issues. An ontology would assist smartphone context-aware applications in querying a sensor or group of sensors to extract low-level data for performing high-level tasks. Although a number of sensors and sensor networks ontologies have been presented by researchers, none of them is complete enough to satisfy the requirements and usage of smartphone context-aware applications. However, some of these ontologies (e.g., SSN) can be reused to develop a comprehensive smartphone sensors ontology.

In this paper, the development of an ontology-based smartphone and sensors repository referred to as SmartOntoSensor has been presented. SmartOntoSensor includes definitions of concepts and properties that are partially extended from the more general and comprehensive SSN ontology, partially influenced by SensorML, and partially identified from the relevant literature. SmartOntoSensor provides descriptions of smartphone systems and deployments, sensors and their classification, sensor measurement capabilities and properties, observations as properties of features of interest, inputs and outputs of sensing processes, context identification using sensor outputs, and the invocation of services according to contexts. To the best of our knowledge, this is the first such effort in the area of smartphone sensor ontologies to unify smartphone, sensor, and contextual information with general world knowledge about entities and relations. SmartOntoSensor has been developed and evaluated using state-of-the-art technologies and standards to demonstrate its utility and value. The test results indicate that SmartOntoSensor has an improved ontological design for semantic modeling of the domain as compared to the other ontologies and is more complete, providing rich potential for representing information about the domain to answer ontology-related questions. Given the quality and efficiency of SmartOntoSensor, a number of its potential application areas have also been outlined.

However, SmartOntoSensor does not claim to be the definitive, orthogonal, and universally acceptable smartphone sensors ontology; rather, it is an attempt to build a pragmatic smartphone sensors repository, with supporting rationale, using the currently available tools, so that the ontology can be deployed in a variety of application domains. Despite promising lab test results and proofs of concept, SmartOntoSensor needs improvement in several aspects to claim its place in the market, such as (1) the inclusion of more detailed and relevant concepts and properties to increase its coverage and expressiveness, (2) heavy instantiation of the concepts with real-world data to thoroughly test its quality and correctness, (3) thorough investigation by domain experts to indicate any discrepancy, redundancy, or ambiguity, and (4) knowledge-base-level comparison with sensors and sensor networks ontologies in addition to the schema-based comparison.

Future work includes the application of SmartOntoSensor in more complex real-world solutions to check its efficiency and performance and to indicate any potential extensions and improvements. In addition, most organizations have captured sensor-collected information in their repositories in either structured or unstructured formats (e.g., GeoLife GPS Trajectories). Therefore, there is a strong need for systems that can automatically analyze various structured and unstructured data sources and extract relevant concepts and entities to extend and populate SmartOntoSensor. This automatic information extraction process would help in further evaluating SmartOntoSensor and in designing more effective context-aware applications.

Competing Interests

The authors declare that they have no competing interests.

Acknowledgments

This research work has been undertaken by the first author as partial fulfillment of a Ph.D. degree, with support from the Higher Education Commission (HEC) of Pakistan.

References

[1] S. Ali, S. Khusro, A. Rauf, and S. Mahfooz, "Sensors and mobile phones: evolution and state-of-the-art," Pakistan Journal of Science, vol. 66, no. 4, pp. 385–399, 2014.

[2] Y. Wang, J. Lin, M. Annavaram et al., "A framework of energy efficient mobile sensing for automatic user state recognition," in Proceedings of the 7th ACM International Conference on Mobile Systems, Applications, and Services (MobiSys '09), pp. 179–192, Krakow, Poland, June 2009.

[3] J. Subercaze, P. Maret, N. M. Dang, and K. Sasaki, "Context-aware applications using personal sensors," in Proceedings of the 2nd International Conference on Body Area Networks (ICST '07), pp. 1–5, Florence, Italy, 2007.

[4] P. Korpipaa and J. Mantyjarvi, "An ontology for mobile device sensor-based context awareness," in Proceedings of the 4th International and Interdisciplinary Conference (CONTEXT '03), pp. 451–458, Stanford, Calif, USA, June 2003.

[5] D. J. Russomanno, C. R. Kothari, and O. A. Thomas, "Building a sensor ontology: a practical approach leveraging ISO and OGC models," in Proceedings of the International Conference on Artificial Intelligence (ICAI '05), pp. 637–643, Las Vegas, Nev, USA, June 2005.

[6] C. A. Henson, J. K. Pschorr, A. P. Sheth, and K. Thirunarayan, "SemSOS: semantic sensor observation service," in Proceedings of the International Symposium on Collaborative Technologies and Systems (CTS '09), Baltimore, Md, USA, May 2009.

[7] H. Neuhaus and M. Compton, "The semantic sensor network ontology: a generic language to describe sensor assets," in Proceedings of the AGILE 2009 Pre-Conference Workshop Challenges in Geospatial Data Harmonisation, Hannover, Germany, 2009.

[8] M. Compton, P. Barnaghi, L. Bermudez et al., "The SSN ontology of the W3C semantic sensor network incubator group," Web Semantics: Science, Services and Agents on the World Wide Web, vol. 17, pp. 25–32, 2012.

[9] N. F. Noy and D. L. McGuinness, "Ontology Development 101: A Guide to Creating Your First Ontology," 2001, http://www.ksl.stanford.edu/people/dlm/papers/ontology-tutorial-noy-mcguinness.pdf.

[10] D. J. Russomanno, C. Kothari, and O. Thomas, "Sensor ontologies: from shallow to deep models," in Proceedings of the 37th Southeastern Symposium on System Theory (SST '05), pp. 107–112, Tuskegee, Ala, USA, March 2005.

[11] S. Avancha, C. Patel, and A. Joshi, "Ontology-driven adaptive sensor networks," in Proceedings of the 1st Annual International Conference on Mobile and Ubiquitous Systems (MobiQuitous '04), pp. 194–202, Boston, Mass, USA, August 2004.

[12] M. Calder, R. A. Morris, and F. Peri, "Machine reasoning about anomalous sensor data," Ecological Informatics, vol. 5, no. 1, pp. 9–18, 2010.

[13] M. Eid, R. Liscano, and A. El Saddik, "A universal ontology for sensor networks data," in Proceedings of the IEEE International Conference on Computational Intelligence for Measurement Systems and Applications (CIMSA '07), pp. 59–62, IEEE, Ostuni, Italy, June 2007.

[14] M. Gomez, A. Preece, M. P. Johnson et al., "An ontology-centric approach to sensor-mission assignment," in Knowledge Engineering: Practice and Patterns, A. Gangemi and J. Euzenat, Eds., vol. 5268 of Lecture Notes in Computer Science, pp. 347–363, Springer, Berlin, Germany, 2008.

[15] J.-H. Kim, H. Kwon, D.-H. Kim, H.-Y. Kwak, and S.-J. Lee, "Building a service-oriented ontology for wireless sensor networks," in Proceedings of the 7th IEEE/ACIS International Conference on Computer and Information Science (IEEE/ACIS ICIS '08), pp. 649–654, Portland, Ore, USA, May 2008.

[16] M. Botts and A. Robin, "OpenGIS® Sensor Model Language (SensorML) Implementation Specification," 2007, http://portal.opengeospatial.org/files/?artifact_id=21273.

[17] A. Pease, I. Niles, and J. Li, "The suggested upper merged ontology: a large ontology for the semantic web and its applications," in Proceedings of the AAAI-2002 Workshop on Ontologies and the Semantic Web, Edmonton, Canada, 2002.

[18] M. Compton, C. Henson, L. Lefort, H. Neuhaus, and A. Sheth, "A survey of the semantic specification of sensors," in Proceedings of the 8th International Semantic Web Conference (ISWC '09), 2nd International Workshop on Semantic Sensor Networks, Washington, DC, USA, 2009.

[19] A. Underbrink, K. Witt, J. Stanley, and D. Mandl, "Autonomous mission operations for sensor webs," in Proceedings of the American Geophysical Union, Fall Meeting 2008, San Francisco, Calif, USA, December 2008.

[20] L. Bermudez, E. Delory, T. O'Reilly, and J. Del Rio Fernandez, "Ocean observing systems demystified," in Proceedings of the MTS/IEEE Biloxi—Marine Technology for Our Future: Global and Local Challenges (OCEANS '09), pp. 1–7, Biloxi, Miss, USA, October 2009.

[21] C. Schlenoff, T. Hong, C. Liu, R. Eastman, and S. Foufou, "A literature review of sensor ontologies for manufacturing applications," in Proceedings of the 11th IEEE International Symposium on Robotic and Sensors Environments (ROSE '13), pp. 96–101, October 2013.

[22] Y. Shi, G. Li, X. Zhou, and X. Zhang, "Sensor ontology building in semantic sensor web," in Internet of Things, Y. Wang and X. Zhang, Eds., vol. 312, pp. 277–284, Springer, Berlin, Germany, 2012.

[23] M. C. Suarez-Figueroa, A. Gomez-Perez, and M. Fernandez-Lopez, "The NeOn methodology for ontology engineering," in Ontology Engineering in a Networked World, M. C. Suarez-Figueroa, A. Gomez-Perez, E. Motta, and A. Gangemi, Eds., pp. 9–34, Springer, Berlin, Germany, 2012.

[24] S. Ali, S. Khusro, and H. Chang, "POEM: practical ontology engineering model for semantic web ontologies," Cogent Engineering, vol. 3, no. 1, pp. 1–39, 2016.

[25] V. Presutti and A. Gangemi, "Content ontology design patterns as practical building blocks for web ontologies," in Proceedings of the 27th International Conference on Conceptual Modeling, pp. 128–141, Berlin, Germany, 2008.

[26] X. Wang, X. Zhang, and M. Li, "A survey on semantic sensor web: sensor ontology, mapping and query," International Journal of u- and e-Service, Science and Technology, vol. 8, no. 10, pp. 325–342, 2015.

[27] E. Simperl, "Reusing ontologies on the Semantic Web: a feasibility study," Data & Knowledge Engineering, vol. 68, no. 10, pp. 905–925, 2009.

[28] D. Preuveneers, J. V. den Bergh, D. Wagelaar et al., "Towards an extensible context ontology for ambient intelligence," in Proceedings of the 2nd European Symposium on Ambient Intelligence, pp. 148–159, Eindhoven, Netherlands, 2004.

[29] J. Brank, M. Grobelnik, and D. Mladenic, "A survey of ontology evaluation techniques," in Proceedings of the 8th International Multi-Conference Information Society, pp. 166–169, 2005.

[30] T. J. Lampoltshammer and T. Heistracher, "Ontology evaluation with Protege using OWLET," Infocommunications Journal, vol. 6, pp. 12–17, 2014.

[31] H. Hlomani and D. Stacey, "Approaches, methods, metrics, measures, and subjectivity in ontology evaluation: a survey," 2014, http://www.semantic-web-journal.net/system/files/swj657.pdf.

[32] A. Gomez-Perez, Ontology Evaluation, Springer, Berlin, Germany, 1st edition, 2004.

[33] S. Tartir and I. B. Arpinar, "Ontology evaluation and ranking using OntoQA," in Proceedings of the 1st IEEE International Conference on Semantic Computing (ICSC '07), pp. 185–192, Irvine, Calif, USA, September 2007.

[34] S. Tartir, I. B. Arpinar, and A. Sheth, "Ontological evaluation and validation," in Theory and Applications of Ontology: Computer Applications, R. Poli, M. Healy, and A. Kameas, Eds., pp. 115–130, Springer, Amsterdam, The Netherlands, 2010.

[35] Y. Shi, G. Li, X. Zhou, and X. Zhang, "Sensor ontology building in semantic sensor web," in Internet of Things, Y. Wang and X. Zhang, Eds., vol. 312, pp. 277–284, Springer, Berlin, Germany, 2012.

[36] A. Burton-Jones, V. C. Storey, V. Sugumaran, and P. Ahluwalia, "A semiotic metrics suite for assessing the quality of ontologies," Data and Knowledge Engineering, vol. 55, no. 1, pp. 84–102, 2005.

[37] M. Horridge, H. Knublauch, A. Rector, R. Stevens, and C. Wroe, "A Practical Guide to Building OWL Ontologies Using the Protege-OWL Plugin and CO-ODE Tools," 2004, http://mowl-power.cs.man.ac.uk/protegeowltutorial/resources/ProtegeOWLTutorialP4_v1_3.pdf.

[38] E. Blomqvist, V. Presutti, E. Daga, and A. Gangemi, "Experimenting with eXtreme design," in Proceedings of the 17th International Conference on Knowledge Engineering and Management by the Masses (EKAW '10), pp. 120–134, Lisbon, Portugal, 2010.

[39] S. Khusro, S. Ali, A. Rauf et al., "Unleashing sensor data on linked open data—the story so far," Life Science Journal, vol. 10, no. 4, pp. 1766–1786, 2013.

[40] C. Gurrin, Z. Qiu, M. Hughes et al., "The smartphone as a platform for wearable cameras in health research," American Journal of Preventive Medicine, vol. 44, no. 3, pp. 308–313, 2013.

[41] N. E. Petroulakis, I. G. Askoxylakis, and T. Tryfonas, "Life-logging in smart environments: challenges and security threats," in Proceedings of the IEEE International Conference on Communications (ICC '12), pp. 5680–5684, Ottawa, Canada, June 2012.

[42] P. Wetterwald, "Android@Home," in Proceedings of the Google I/O Developer Conference, San Francisco, Calif, USA, 2011.

[43] D. Choujaa and N. Dulay, "Activity recognition using mobile phones: achievements, challenges and recommendations," in Proceedings of the Workshop on How to Do Good Research in Activity Recognition: Experimental Methodology, Performance Evaluation and Reproducibility, in Conjunction with UbiComp, 2010.

[44] N. Ravi, N. Dandekar, P. Mysore, and M. L. Littman, "Activity recognition from accelerometer data," in Proceedings of the 17th Conference on Innovative Applications of Artificial Intelligence, vol. 3, pp. 1541–1546, Pittsburgh, Pa, USA, July 2005.

[45] A. Khan, S. Khusro, A. Rauf, and S. Mahfooz, "Rebirth of augmented reality—enhancing reality via smartphones," Bahria University Journal of Information & Communication Technologies, vol. 8, no. 1, pp. 110–121, 2015.
