
Cloud-based Facility Management Benchmarking

Sofia Martins
[email protected]

Instituto Superior Técnico, Lisboa, Portugal

May 2015

Abstract

Benchmarking the performance of facilities is being hailed as an effective driver for change and continuous improvement that is expected to increase the overall effectiveness of Facilities Management (FM) practice. However, despite the fact that modern FM is backed by specific software to monitor different categories of indicators, a principled and systematic approach to FM benchmarking is still lacking, a state of affairs that can be explained mostly by the lack of commonly agreed indicators for facilities. This article discusses the existing frameworks for facilities benchmarking, taking into account both the scientific literature and current industry practice. Moreover, a set of normalized indicators is derived that can be used as a basis to create a computational FM benchmarking solution enabling stakeholders to contrast the performance of their facilities.

Keywords: Facilities Management, Benchmarking, Key Performance Indicators

1. Introduction

Facility Management (FM) is a non-core activity that supports organizations in the pursuit of their objectives (core business), and is considered "the practice of coordinating the physical workplace with the people and work of the organization", integrating the principles of business administration, architecture, and the behavioral and engineering sciences [15].

Over the past three decades FM has seen significant development due to a number of factors, ranging from an increase in construction costs and increased performance requirements by users and owners, to a greater recognition of the effect of space on productivity [17]. Consequently, Facilities Management has matured into a professional activity and it is estimated that, along with related activities, it represents more than 5% of the GDP of both European countries and the US [18].

Since facilities-related expenditure is a big slice of an organization's base cost, FM has been equipping itself with appropriate tools [28, 23], such as Computer Aided Facility Management (CAFM), Computerized Maintenance Management Systems (CMMS), Building Automation Systems (BAS), Energy Management Systems (EMS), Enterprise Resource Planning (ERP), and Integrated Workspace Management Systems (IWMS).

These software applications rely on large amounts of integrated data, making it possible to extract measures and indicators and, with them, to calculate Key Performance Indicators (KPIs). Indeed, these KPIs give important insight into the functioning of FM activities (keeping track of KPIs is one aspect of quality control [7]), as well as enabling organizations to compare the performance of their facilities and services with one another.

Benchmarking provides important advantages to FM, such as the justification of energy consumption and costs; the identification of weaknesses, threats, strengths, opportunities, and best practices; and the addition of value to facilities by integrating them into CAFM systems and supporting maintenance management. Although some organizations have their own benchmarking software, it is not compatible across distinct organizations, and there is no centralized mechanism to integrate all these data. Therefore, benchmarking between FM organizations is not yet possible.

Furthermore, as in other management disciplines, it is still not clear which are the most important KPIs to be used in FM. Moreover, the success of a benchmarking strategy depends on a number of organizational factors, such as the management support behind it, the personality of the managers, the organizational structure, and the FM contract [8].

This paper discusses the FM benchmarking problem, identifies the main benchmarking indicators of interest, and provides a survey of the main Facility Management tools and the different Facilities Management standards. Finally, a set of indicators is discussed and proposed as the computational support for FM benchmarking.


2. State-of-the-Art

2.1. International Standards for FM Benchmarking

The need for FM standards has been recognized in multiple sources [32]. In this research work, we select the most important ISO standards (ISO being the largest developer of voluntary International Standards, covering all aspects of technology and business) for FM and Maintenance, aimed at those working with facilities in the various stages of their life cycle, ordered by ICS (the International Classification for Standards, the structure used for catalogs of international, regional, and national technical standards).

2.2. Benchmarking

Benchmarking is the process of determining who performs best. This is very important in business, for example to understand which is the best sales organization, how to quantify a standard, or who has the lowest cleaning expenses [26]. Moreover, benchmarking identifies what other businesses do to increase profit and productivity, so that those methods can subsequently be adapted to your own organization to make the business more competitive [31].

The importance of standards resides in the creation of specifications that normalize how an activity is performed. In the case of benchmarking, standardizing how companies evaluate their FM data enables compatibility between organizations, so that a given measurement is not interpreted differently by different organizations.

Moreover, we can conclude that a cloud-based FM benchmarking system spanning several organizations and facilities does not yet exist in Portugal. Such a system would have a great impact on the country's economy, since FM represents about 5% of global Gross Domestic Product [25, 22].

2.3. Existing Software Solutions

There are many types of software for FM, such as CAFM or IWMS. All of the known FM solutions, like Maxpanda [21] or IBM Tririga [13], offer a simpler way to manage facilities. They centralize organizations' information, making management more efficient through business analytics, critical alerts, and increased visibility and control. Some of them, for instance ARCHIBUS [1] or FM:Systems [9], integrate with CAD or BIM models, which is very important for visualizing department occupation and other areas of space and occupancy management. Most of these systems promote their capabilities for organizational cost reduction, since they cost-justify real changes in preventive maintenance routines and predict the cost effects of preventive maintenance changes before they are made. Some support multiple users, and others allow each user to access only the information appropriate to his or her position in the organization.

Solutions such as Indus System [14], Manhattan Mobile Apps [19], PNMSoft [24], and ARCHIBUS offer cloud-based software that enables users to access FM systems anywhere, on mobile devices, from a browser. Indus System enables users to store, share, and view drawings, space, assets, related costs, leases, and contracts just by accessing the browser.

On the other hand, ARCHIBUS and PNMSoft are both capable of showing an organization's KPIs through their web sites. These packages let users carry on with their daily management software and then, when necessary, visualize the results in a graphical web application. However, this is only applicable to facilities that have the ARCHIBUS or PNMSoft software installed, and only for comparison against previous results from the same facility. In contrast, with our solution, any organization can benefit from these features plus one more: comparison with other organizations in the same sector. Table 1 presents a summary of the characteristics of the different FM software solutions.

We asked the providers of Maxpanda, IBM Tririga, ARCHIBUS, FM:Systems, and Indus System for their collaboration in order to understand which indicators each of these packages focuses on most. We concluded that many Financial indicators are not considered by the providers, while indicators of Service Quality, Satisfaction, and Spatial indicators are widely implemented. These differences are related to the software classification: Software B is a CMMS solution, which is why it focuses on maintenance and cleaning indicators, while Software A is a CAFM solution and therefore does not focus on those indicators.

3. Scientific Literature

In order to understand which KPIs are most relevant to the field of FM, we combine the results of distinct authors. Table 2 presents the various KPIs pointed out by the scientific literature, along with their frequency of reference. The most cited indicators are the Financial and Spatial ones, followed by Maintenance and Cleaning. Quality of Service/Product and Client Satisfaction are also important indicators.

To arrive at a list of indicators with the potential to be applied across a larger number of organizations, the methodology requires analyzing the frequency of distinct indicators in standard benchmarking surveys and in FM software. However, surveys and software packages show significant differences in indicator frequency, with the exception of certain financial and spatial indicators (5 in total). It turns out that software packages focus more on performance, service, and satisfaction indicators, rather than on the maintenance indicators found in the literature.


Software Solution | Centralization of Organizations' Info | Business Analysis | Increased Visibility and Control | Costs Reduction | CAD/BIM Integration | Cloud Application | Benchmarking
Maxpanda      | • | • | • | • |   |   |
IBM Tririga   | • | • | • | • |   |   |
FM:Systems    | • | • | • | • | • |   |
Indus System  | • | • | • | • |   | • |
PNMSoft       | • | • | • | • |   | • | •
ARCHIBUS      | • | • | • | • | • | • | •

Table 1: Characteristics of the distinct FM software solutions.

Indicator | Total

Financial Indicators
Total Cleaning Cost | 1
Cleaning Cost per m2 | 1
Total Maintenance Cost | 2
FM Costs | 0
Utility Costs | 1
Facility Budget/Corporation Budget | 1
Occupancy Cost per m2 | 3
Space Costs per m2 | 2
Operation Cost per m2 | 2
Moving Costs | 1
HSSE Costs | 1
Security Costs | 1
Logistic Costs | 0
Hospitality Costs | 0
Project Costs (Deviation) | 3
Financial Ratios | 3
Annual Income | 2
Total Annual Facility Cost | 3
Annual Cost of Energy Purchased | 1
Total Environment Cost | 2
Outdoor Costs | 1

Spatial Indicators
Net Floor Area per FTE | 4
Percentage Net Floor Area | 3
Percentage Internal Area | 3
Percentage Gross Floor Area | 3
Support Area | 1

Maintenance/Cleaning Indicators
Repairs VS Preventive Maintenance | 2
Asset Replacement Values | 2
Percentage of Area Cleaned | 2

Productivity Indicators
Core Operating Hours of Facility (FM) | 0
Uptime Facility (Business) | 0
Staff Turnover (Human Resources) | 1
Absenteeism (Human Resources) | 1

Environmental Indicators
CO2 Emissions | 0
Total Energy Consumption | 1
Total Water Usage | 0
Total Waste Production | 0

Service Quality Indicators
Quality of Service/Product | 2
Quality of Cleaning | 1
Quality of Workplace | 1
Quality of Security | 1
Quality of Reception and Contact Center | 0
Quality of Document Management | 0

Satisfaction Indicators
Client Satisfaction | 2
Satisfaction with FM | 0
Satisfaction with Space | 0
Satisfaction with Outdoors | 0
Satisfaction with Cleaning | 0
Satisfaction with Workspace | 0
Satisfaction with HSSE (Health, Safety, Security and Environment) | 1
Satisfaction with Hospitality | 0
Satisfaction with ICT (Information and Communication Technology) | 0
Satisfaction with Logistics | 0

Table 2: List of KPIs covered by the literature. The Total column gives the number of sources, among the USA, UK and Chile projects [5], IFMA [28], [12], [20], and [11], that reference each KPI.

3.1. Priority Analysis

In order to understand which KPIs are most important for FM, we need to combine data from the scientific literature with the existing solutions, enabling the identification of the most relevant KPIs based on importance and frequency. However, some of the KPIs mentioned in the literature may not be the most suitable: sometimes there were repeated KPIs with different names, or KPIs subsumed by other, less specific ones; for example, the indicator Life Planning Costs is part of the indicator HSSE Costs.


Moreover, some of the KPIs most frequently reported in the literature require measurements that are very difficult to obtain, such as the CO2 Emissions indicator; organizations do not want to go to all that trouble, and are thus unable to gather enough data to calculate all of the previously referred KPIs.

The indicators most prevalent in the literature and in the software applications were chosen for the final set of KPIs. For example, Client Satisfaction has 3 references in Table 2 and one reference in the corresponding table of indicators in today's software, making a total of four references; thus, Client Satisfaction was chosen for the final list. In addition, the indicator Occupancy Cost per Operation Costs was chosen, which is an aggregation of two indicators: Occupancy Cost per m2 and Operation Cost per m2. The final list of KPIs can be seen in Table 3.
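This selection rule can be read as a simple scoring pass over the two frequency tables. The sketch below, in Ruby (the language of the proposed solution's server side), illustrates it with made-up counts and an illustrative cutoff; the paper itself selects indicators by inspection rather than by a stated numeric threshold.

```ruby
# Rank each indicator by its combined reference count in the literature
# survey (Table 2) and the software survey, then keep those above a
# cutoff. Counts and cutoff are illustrative, not the paper's data.
literature = { "Client Satisfaction" => 3, "Net Floor Area per FTE" => 4, "Moving Costs" => 1 }
software   = { "Client Satisfaction" => 1, "Net Floor Area per FTE" => 1, "Moving Costs" => 0 }

CUTOFF = 3
selected = literature
  .map { |kpi, lit_count| [kpi, lit_count + software.fetch(kpi, 0)] }
  .select { |_, total| total >= CUTOFF }
  .sort_by { |_, total| -total }

selected.each { |kpi, total| puts "#{kpi}: #{total} references" }
# => Net Floor Area per FTE: 5
# => Client Satisfaction: 4
```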

4. Solution Proposal

Given the limitations of the current benchmarking solutions, there is a clear need for a new and innovative benchmarking application. The solution should give FM benchmarking results to organizations through a graphical representation of their KPIs, aggregating different organizations with different FM systems. Indeed, the aggregation of data from different organizations is the most important feature of the solution.

The solution brings better insight into an organization's FM position relative to others, since it uses the gathered information to compare facilities. Organizations' and facilities' identities are kept confidential; however, ranking by their results is still possible. This cloud-based solution is developed as a web application divided into two parts: server side and client side. The server side runs directly on the hosting servers, while the client side runs in the browser as an endpoint to the server. To better understand the solution, an overview of the architecture can be found in Figure 1. This organization of the system enables the solution to:

• Provide an authentication service to authorize the users of an organization.

• Integrate data from distinct organizations.

• Cache database results for better performance.

4.1. Requirements

To develop an interface, user opinion is very important: to achieve a system that is well accepted by its final users, the whole concept has to be developed around their needs. For this reason, it is essential to establish a methodology in which every step leads to a well-designed, easy-to-use, and useful interface.

Figure 1: FM benchmarking system architecture. Organizations send their data in a standard format to be stored in a database. This information is then accessed by the users representing each organization.

Thus, our interface development process consists of the following steps:

1. Elicitation: interviews with experts to elicit system requirements.

2. User requirements specification: establish who the stakeholders are and what their needs are. A questionnaire was used to understand which metrics and indicators users would like to see in the first version of the system.

3. Functional requirements and constraints definition: specify exactly what the system must do.

4. Definition of a system domain model: understand how the components relate to each other.

5. Creation of a conceptual interface model: create low-fidelity sketches in order to integrate all requirements into the screens. This step is not trivial; to help define the essential interactions and prioritize them, it is useful to create a conceptual model based on the previous steps.

The previous methodology was applied to create and develop the benchmarking reports interface. To support the implementation of the previous steps, we used the Confluence team collaboration software [2], which integrates several tools enabling simple elaboration of the models, mockups, and requirements of our system. This phase enabled us to uncover inconsistencies upfront, before starting the implementation phase.

4.2. Creating the interface's conceptual model

In order to develop a benchmarking system that integrates distinct facilities and organizations, a simple and user-friendly interface was developed, aiming at users' better understanding of and satisfaction with the platform.


Indicator | Units | Description

Financial Indicators
Cleaning Cost per m2 | €/m2 | Total Cleaning Cost / Net Room Area, OR Total Cleaning Cost / Net Floor Area
Total Maintenance Cost | € | Sum of the maintenance costs for electrical equipment, HVAC, elevators, escalators, generators, UPS, ICT maintenance, etc.
FM Costs | € | Costs of the FM department, OR of FM outsourcing
Utility Costs | € | Sum of the costs for water, electricity, oil, gas, and others
Space Costs per m2 | €/m2 | Total Space Costs / Net Floor Area
Occupancy Cost | €/m2 | Total Occupancy Cost / Net Floor Area
Occupancy Cost per Operation Costs | % | (Occupancy Cost / Total Operation Costs) × 100

Spatial Indicators
Net Floor Area per FTE | m2/FTE | Net Floor Area / Number of FTE personnel
Percentage Net Floor Area | % | (Net Floor Area / Total Level Area) × 100
Percentage Internal Area | % | (Internal Area / Total Level Area) × 100
Percentage Gross Floor Area | % | (Gross Floor Area / Total Level Area) × 100

Maintenance/Cleaning Indicators
Repairs VS Preventive Maintenance (by specialty) | % | (Number of Corrective Maintenance actions per month / Number of Preventive Maintenance actions per month) × 100
Asset Replacement Values (by specialty) | % | (Annual Maintenance Cost / Maintained Assets' Replacement Value) × 100
Percentage of Area Cleaned | % | Area Cleaned / Net Floor Area

Productivity Indicators
Staff Turnover | % | (Number of Employee Departures (FTE) / Average Number of Staff Members (FTE) Employed) × 100
Absenteeism | % | (Total Days Lost / Total Possible Days Worked) × 100

Environmental Indicators
Total Energy Consumption | kWh |
Total Water Usage | m3 |

Service Quality Indicators
Quality of Service/Product | Scale | Values obtained through audits or questionnaires
Quality of Cleaning | Scale | Values obtained through audits or questionnaires
Quality of Workplace | Scale | Values obtained through audits or questionnaires
Quality of Security | Scale | Values obtained through audits or questionnaires

Satisfaction Indicators
Client Satisfaction | % | Values obtained through questionnaires
Satisfaction with HSSE | % | Values obtained through questionnaires

Table 3: Final list of proposed KPIs [4, 5, 16, 27, 3].
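To make the formulas in Table 3 concrete, the short Ruby snippet below evaluates two of them on made-up inputs; the figures are illustrative only, not data from the paper.

```ruby
# Worked example of two of the Table 3 formulas, using made-up inputs.
net_floor_area       = 12_500.0   # m2
total_cleaning_cost  = 48_000.0   # EUR per year
corrective_per_month = 18         # corrective maintenance actions
preventive_per_month = 45         # preventive maintenance actions

cleaning_cost_per_m2  = total_cleaning_cost / net_floor_area
repairs_vs_preventive = 100.0 * corrective_per_month / preventive_per_month

puts format("Cleaning Cost per m2: %.2f EUR/m2", cleaning_cost_per_m2)
# => Cleaning Cost per m2: 3.84 EUR/m2
puts format("Repairs VS Preventive Maintenance: %.1f%%", repairs_vs_preventive)
# => Repairs VS Preventive Maintenance: 40.0%
```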

This interface was evaluated incrementally, enabling an individual assessment of each component. The base system has two main features, which can be performed by the user:

Data Insertion: lets the user create and insert organization and facility information (name, address, NIF, and all corresponding metrics). This data can be entered by hand or imported from an Excel file.

Indicators Report: lets the user visualize and study all indicators calculated by the system from the previously entered metrics. These indicators are shown in a set of charts in which different types of comparisons can be made, within a single facility or between facilities.
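As a minimal sketch of the Excel import path: the paper specifies a spreadsheet import but not the file schema or parsing library, so the "roo" gem and the one-row-per-measure layout below are assumptions.

```ruby
# Read an uploaded spreadsheet row by row, one measure per row.
require "roo"

xlsx = Roo::Excelx.new("facilityB.xlsx")
xlsx.default_sheet = xlsx.sheets.first

(2..xlsx.last_row).each do |i|          # row 1 holds the headers
  facility_name, metric, month, value = xlsx.row(i)
  # In the real application each row would become a Measure record
  # bound to the corresponding Facility (see Section 4.4).
  puts "#{facility_name} | #{metric} | #{month} | #{value}"
end
```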

4.3. Solution Architecture

As explained before, the proposed solution is a cloud web application; its architecture therefore has two main modules: server side and client side. Another important component of the solution is the database and how it is organized. These three components are explained in more detail below:

Client Side: runs in the user's web browser, performing a series of HTTP requests to the server. Bootstrap was used, a framework that implements responsive-design best practices and enables rapid, high-quality CSS and HTML development. The charts are generated with the Highcharts JavaScript library.

Server Side: where the application runs; it is responsible for processing and storing the data sent by the organizations to the database. Ruby on Rails was used as the server-side framework, which is also responsible for generating the HTML templates that are sent to the client side.


Figure 2: System architecture overview. The image shows a clear contrast between the server side and the client side of the application, and how they interact with each other.


Database: PostgreSQL was used as the relational database; it is logically divided into three parts: the Input Data Staging Area, the KPI Aggregated Data, and the Facility Metadata.

The server and client sides interact with each other through HTTP requests; the overall architecture is illustrated in Figure 2, where the division between these two modules can be seen clearly.
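As a hypothetical sketch of this interaction, the server's side of one such request might look like the Rails controller below. The controller, model, and column names are illustrative; the paper does not publish its code.

```ruby
# Answers the client's chart requests with pre-aggregated KPI rows as JSON.
class BenchmarksController < ApplicationController
  before_action :authenticate_user!    # the authentication service

  # GET /facilities/:facility_id/benchmarks.json
  def index
    # Users may only read data for their own facilities (Section 4.4).
    facility = current_user.facilities.find(params[:facility_id])

    # Served from the KPI Aggregated Data cache rather than recomputed
    # from the Input Data Staging Area on every request (Section 4.5).
    kpis = KpiAggregate.where(facility: facility).order(:indicator, :month)

    render json: kpis.as_json(only: [:indicator, :month, :value])
  end
end
```

On the client side, the Highcharts code would consume this JSON to draw the comparison charts.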

4.4. Server Side

All of the software logic is on the server side of the application; all processing and data storage is executed here. Authentication and data management (CRUD: Create, Read, Update, and Delete) are the responsibility of the server. There are six main entities in this solution:

User: each user can have more than one role and several facilities. Also, each user can only access the data of his or her own facilities.

Role: each role represents the part played by the user in each facility bound to that specific role. There are four distinct roles: Facility Manager, Occupant, Owner, and Service Provider.

Facility: represents a facility of a specific organization, for example a specific building of a university.

Static Measure: facility measures that rarely change, such as Net Floor Area or Full-Time Equivalent, unless there are major transformations to the facility. The input of these measures is rare, so they had to be treated differently.

Measure: these measures have to be entered into the application each month and are bound to a specific facility. Examples of this type of measure are Water or Energy Consumption, or Cleaning Costs. Because some of these measures do not represent a specific month (for instance, half of January and half of February), it was important to transform them to represent a specific month for the benchmarking. Thus, a granular transformation of the measure from month to day was applied (we found the average value for each day), and the days were then re-aggregated into specific months, as explained in Section ?? and sketched in the code example after this list.

Benchmarking: by applying the measures and static measures, a set of metrics is calculated and used to benchmark facilities. The benchmarking results are shown through several charts.

For example, a user can have the role "Facility Manager of IST". In this case, this role is bound to several facilities: the Taguspark building, the Central Building in Alameda, and other IST buildings. Each facility has its own measures and static measures and, therefore, its own benchmark reports.
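The month-to-day-to-month normalization of measures can be sketched as follows, assuming a measure is given as a (start_date, end_date, value) triple; the helper name is ours, not the system's.

```ruby
# Spread a measure's value evenly over the days of its period, then
# re-aggregate the daily values by calendar month.
require "date"

def normalize_to_months(start_date, end_date, value)
  days = (start_date..end_date).to_a
  per_day = value / days.size.to_f              # average value per day
  days.group_by { |d| [d.year, d.month] }       # re-aggregate by month
      .transform_values { |ds| (per_day * ds.size).round(2) }
end

# A bill covering half of January and half of February 2015:
p normalize_to_months(Date.new(2015, 1, 16), Date.new(2015, 2, 15), 310.0)
# => {[2015, 1]=>160.0, [2015, 2]=>150.0}
```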

4.5. Database Layer

The relational database architecture initially developed had two main components: the Input Data Staging Area and the Facility Metadata. The Input Data Staging Area is responsible for storing the data (measures) sent by the organizations, while the Facility Metadata stores standard facility information such as address or name. Each time a user accessed the benchmarking page, it was necessary to compute the benchmark data and present it through charts. However, this process was very time- and memory-consuming, because all of a facility's indicators had to be computed every time the page was accessed; consequently, the system was slow.

Therefore, it was essential to design a different database architecture. It was then decided to add another component to the database: the KPI Aggregated Data. This component uses the data in the Input Data Staging Area to calculate the KPIs and store them; it works as a cache of indicators. Each time the user imports an xls file with measures, all corresponding indicators are calculated and stored. Thus, it is not necessary to calculate the indicators at run time, only to query them from the database. This update to the architecture makes it possible to deliver KPIs faster while consuming less memory, as validated in Section 5.3.


Figure 3: Database without cache. It receives the data and stores it in the Input Data Staging Area.

Figure 4: Database with cache. It receives the data and stores it in the Input Data Staging Area; then, when the KPIs are calculated, they are stored in the KPI Aggregated Data.

Both database architectures are presented in Figures 3 and 4.
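The compute-on-import behavior can be sketched roughly as follows. The model names, column names, and the single indicator shown are illustrative, since the paper does not publish its schema.

```ruby
# When new measures arrive, recompute the affected KPIs once and upsert
# them into the KPI Aggregated Data table, so the benchmarking page
# only reads pre-computed rows.
class KpiCache
  def self.refresh(facility, month)
    # Pull this month's raw measures from the Input Data Staging Area.
    measures = Measure.where(facility: facility, month: month)
                      .pluck(:name, :value).to_h

    cleaning = measures["Cleaning Costs"]
    area     = facility.static_measures.find_by(name: "Net Floor Area")&.value
    return unless cleaning && area && area > 0

    kpi = KpiAggregate.find_or_initialize_by(
      facility: facility, indicator: "Cleaning Cost per m2", month: month
    )
    kpi.update!(value: (cleaning / area).round(2))   # EUR/m2, as in Table 3
  end
end
```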

4.6. Client Interface

The client side runs in the browser of the user connecting to the website. The interface enables the user to interact with the application and is also where the statistics about the organization are presented. Because the interface is very important and a key element in the application's appeal, some iterations were necessary to achieve cleaner and easier user interaction.

The first contact the user has with the solution is the landing page, which presents a short description of the main features of the solution and where the user can register or log in to the site.

After the user logs in, a more complex system becomes available, in which the distinct components are presented in three major elements. These elements are organized as follows:

Top Bar: has three clear parts: 1. the logo (also a button to the Home page), 2. the display of the currently selected role (which redirects to the Role Edition page), and 3. the user name (when clicked, it opens a dropdown with buttons for the Account Edition page and Log Out).

Left Side Bar: has five distinct parts: 4. the Details dropdown (links each listed facility to its Details page), 5. the Metrics dropdown (links each listed facility to its Metrics page), 6. the Reports dropdown (links each listed facility to its Benchmarking Reports page), 7. the Add Facility button (opens the New Facility page), and 8. the Minimize button (reduces the side bar width).

Inner Page: consists of only one part, 10. the area of the page where the distinct screens are rendered.

5. Evaluation

This section describes the validation of the proposed solution. It is very important to validate i) the utility of the indicators, ii) the utility of a cloud solution for benchmarking, iii) the usability of the solution, and iv) the performance of the solution. The validation of the indicators is presented in the next subsection. For the usability and performance validation of the solution, we performed usability tests (evaluating the application by testing it with users), qualitative tests (gathering the users' opinions), and performance tests (testing the application's efficiency). This section also presents the test results and their analysis.

5.1. Validation of the Indicators

To understand whether the previously selected set of KPIs for the system is useful, a questionnaire was given to potential users, who were asked to rate each KPI between 0 (not useful) and 10 (very useful). These potential users can be facility owners, facility managers, facility service providers, or facility occupants. The questionnaire was answered by four people from four different sectors: Facility Services, Energy, Real Estate, and Public Administration. The questionnaire results are presented in Figure 5. As can be seen, all indicators had a high average rating; the lowest average was 6.5 and the highest was 8.5. From the results gathered, we can conclude that, in general, all indicators were rated well and, therefore, the list of KPIs does not need to be modified.

5.2. Usability Tests

Usability testing provides a direct perception of how users interact with the application and of the common errors and difficulties. A usability test involving 5-7 representative users generally finds about 80% of the problems [6]. These tests find better ways to improve the application's interface or functionality by asking users directly what should be improved. Thus, as a result of the usability tests, we can understand whether the application interface is well designed and perceptible.

Testing scenarios give participants a background for completing the tasks, establishing introductory information and the rationale of each task. For these reasons, scenarios should be realistic, enabling test participants to relate to the situation being tested [30].


Figure 5: The minimum, maximum, and average rating for each indicator suggested in the KPI questionnaire. The average is represented by the bullet. All indicators were rated well, with every rating average equal to or above 6.5.

Accordingly, we established 3 distinct scenarios, each with several tasks. For each task, we measured the time each user took to complete it; a task not completed within three minutes was considered incomplete. The tasks for each scenario, to be completed by each user testing the prototype, are presented in Table 4.

After the usability test, each participant filled out a usability questionnaire (selected from surveys about ISO 9241 [10, 29]) covering human-system interaction aspects regarding Functionality, Design and Ease of Use, Learning, and Satisfaction. Each aspect is measured on a five-point Likert scale (from 1, "Totally Disagree", through 3, "Neither agree nor disagree", to 5, "Totally Agree").

5.2.1. Results

Five users with a Facility Management background were asked to complete the usability testing. Tasks were generally completed successfully (only four out of a total of thirty tasks were incomplete). More specifically, one user could not complete tasks 3 and 6, while two users could not complete task 4. Users also completed each task within a relatively short period of time, as can be seen in the usability test results in Figure 6.

Figure 6: The minimum, maximum, and average time for each task completed by the users. The average is represented by the bullet. All tasks were completed quickly, with every average time under two minutes.

Figure 7: Average values on the Likert scale for each question in the usability questionnaire.


This shows the ease of the interaction between user and interface, which is backed up by the usability questionnaire results in Figure 7, where the users presented their opinions. Furthermore, it is possible to verify the satisfaction revealed by the users in the questionnaire: all aspects (Functionality, Design and Ease of Use, Learning, and Satisfaction) had good results on the Likert scale. The lowest average rating was on question six; all other questions had an average above three.

5.3. Cache Performance Tests

In order to test the efficiency of the database cache, the performance testing was divided into two stages. First, a set of tests was run in which the KPIs were requested directly from the Input Data Staging Area. In the second stage, the cache was used and the KPIs were requested from the KPI Aggregated Data. This way, it is possible to understand the performance optimization brought by the cache. For coherent testing, the test workload comprised 15 concurrent users trying to access the same URL in both performance testing stages.


Scenario   | Task   | Description
Scenario 1 | Task 1 | Import the xls file "facilityB.xls" for Facility B.
           | Task 2 | Change Facility A's name to "Testing Facility A".
           | Task 3 | Change the Facility Manager role to Owner.
Scenario 2 | Task 4 | Compare your "Cleaning Costs" performance to the previous year.
           | Task 5 | Verify what you can change to improve your next year's results.
Scenario 3 | Task 6 | Verify your room for improvement in "Space Experience" in 2012 relative to other facilities.

Table 4: Tasks to be completed by each user, with a three-minute window per task; the completion time and any incomplete tasks were recorded for each user.


Base Testing: the Benchmarking Page URL was tested for two distinct users, without an implemented cache. The results can be seen in Table 5.

Cached Testing: the Benchmarking Page was tested again with the same two distinct users, now with the indicators cached by the application, so that they are not recalculated each time the Benchmarking Page is requested. Results can be seen in Table 5.

Item                         | Scenario 1 Base | Scenario 1 Cached | Scenario 2 Base | Scenario 2 Cached
Transactions (hits)          | 114    | 552    | 97     | 567
Availability (%)             | 100    | 100    | 100    | 100
Elapsed time (secs)          | 119.75 | 119.79 | 119.51 | 119.99
Data transferred (MB)        | 4.41   | 21.32  | 3.75   | 21.90
Server response time (secs)  | 14.70  | 3.17   | 16.83  | 2.94
Transaction rate (trans/sec) | 0.95   | 4.61   | 0.81   | 4.73
Failed transactions          | 0      | 0      | 0      | 0
Longest transaction (secs)   | 19.43  | 7.25   | 22.08  | 5.06
Shortest transaction (secs)  | 1.77   | 0.77   | 2.25   | 0.73

Table 5: Results from performance testing on the Benchmarking Page for two distinct users (Scenarios 1 and 2). Base presents the results for the first database architecture; Cached presents the results after the cache implementation.

From the performance testing results, it can be concluded that, because more database queries are now needed to fetch the indicators, the number of transactions is higher, and so is the quantity of data transferred. However, it is no longer necessary to calculate all indicators each time the web page is requested, and the server response time drops accordingly (from roughly 15 seconds to roughly 3 seconds).

5.4. Discussion

Through the different tests carried out on the solution, it was possible to validate the utility of the indicators, the usability of the solution, and its performance. The indicator validation showed that users find most of the chosen indicators useful, since all indicators had a high average rating (the lowest average was 6.5 and the highest was 8.5). The usability testing of the solution revealed an easy and fast user-interface interaction. Also, most users showed great satisfaction and found the cloud-based benchmarking solution highly functional and easy to learn. Finally, the performance testing exhibited the great potential of the indicator cache, which decreases transaction times and therefore also improves the global performance of the solution. Overall, the results are very promising.

6. Conclusions

Benchmarking is a fundamental tool for continuous performance improvement in management, with great potential for application to Facility Management. However, the difficulty of finding a commonly accepted set of metrics, combined with the lack of accurate metrics to drive decision-making, makes this goal particularly challenging.

To date, there is no simple solution for facilities benchmarking in Europe; building one is our main goal. Organizations thus continue to use distinct software to support their FM and KPI gathering, which hinders the aggregation and analysis of data.

Overall, with the solution proposed in this project, quantifying real-property performance will be easier and organizations will have a better way to evaluate their own FM metrics, while enabling the comparison of metrics between enterprises and facilities.

An important feature to be added to this application is an information exchange standard that would allow different software packages to interoperate and exchange FM indicator data with the proposed application, integrating the data directly into our system. The manual input or import of different metrics would then no longer be necessary, and the system would become much cleaner, simpler, and easier to use.

References

[1] ARCHIBUS Group. ARCHIBUS Homepage, May 2014.

[2] Atlassian Inc. Confluence Team Software, 2015.

[3] CEN. EN 15341: Maintenance - Maintenance Key Performance Indicators, 2005.

[4] D. B. Costa, H. R. Lima, K. B. Barth, and C. T. Formoso. Performance Measurement Systems for Benchmarking in the Brazilian Construction Industry: A Learning Approach. Porto Alegre, Brasil, 2005.

[5] D. B. Costa, H. R. Lima, and C. T. Formoso. Performance Measurement Systems for Benchmarking in the Brazilian Construction Industry. International Symposium on Globalisation and Construction, AIT Conference Centre, Bangkok, Thailand, pages 1029-1039, 2004.

[6] L. L. Downey. Group Usability Testing: Evolution in Usability Techniques. Journal of Usability Studies, 2:133-144, 2007.

[7] C. T. Fitz-Gibbon. Performance indicators. Benchmark, 12:31, 1990.

[8] FM Leaders Forum. Benchmarking: Effective Performance Management for FM, 2013.

[9] FM:Systems Inc. FM:Systems Homepage, May 2014.

[10] G. Gediga, K.-C. Hamborg, and H. Willumeit. The IsoMetrics Manual. Universität Osnabrück, 1998.

[11] J. Hinks and P. McNay. The creation of a management-by-variance tool for facilities management performance assessment. Facilities, 17:31-53, 1999.

[12] D. C. Ho, E. H. Chan, N. Y. Wong, and M.-W. Chan. Significant Metrics for Facilities Management Benchmarking in the Asia Pacific Region. MCB University Press, 18:545-555, 2000.

[13] IBM Inc. IBM Tririga Homepage, May 2014.

[14] Indus Systems Inc. Indus Systems Homepage, May 2014.

[15] International Facility Management Association. IFMA Homepage, May 2014.

[16] IPD Environment Code. Measuring the Environmental Performance of Buildings, 2010.

[17] S. Lavy and I. M. Shohet. Performance-based facility management: an integrated approach. International Journal of Facility Management, 1(1):1-14, 2010.

[18] T. Madritsch, M. May, H. Ostermann, and R. Staudinger. Computer Aided Facility Management (CAFM) as a New Branch of Decision-making Support Technologies in the Field of Facility Management. In F. Adam and P. Humphreys, editors, Encyclopedia of Decision Making and Decision Support Technologies, pages 84-92, 2008.

[19] Manhattan Software Inc. Manhattan Mobile Apps Homepage, May 2014.

[20] K. Massheder and E. Finch. Benchmarking Metrics Used in UK Facilities Management. MCB University Press, 16:123-127, 1998.

[21] Maxpanda Inc. Maxpanda Homepage, May 2014.

[22] P. A. Jensen. Facilities Management Comes of Age, 2010.

[23] M. Pitt and M. Tucker. Performance Measurement in Facilities Management. Property Management, 26:241-254, 2008.

[24] PNMSOFT Inc. PNMSOFT Homepage, May 2014.

[25] Premises and Facilities Management. Premises and Facilities Management Online, 2011.

[26] F. J. Reh. How to Use Benchmarking in Business, 2015.

[27] R. Loth. Investopedia, February 2014.

[28] L. Roka-Madarasz. Facility Management Benchmarking. 8th International Conference on Management, Enterprise and Benchmarking, pages 171-181, 2010.

[29] R. Safdari, H. Dargahi, L. Shahmoradi, and A. F. Nejad. Comparing Four Softwares Based on ISO 9241 Part 10. Journal of Medical Systems, 36:2787-2793, 2012.

[30] TechSmith. Software Usability Testing with Morae, 2009.

[31] Alexis W. How Do Businesses Use Benchmarking to Improve Productivity and Profit?, 2015.

[32] K. Yu, T. Froese, and F. Grobler. A development framework for data models for computer-integrated facilities management. Automation in Construction, 9(2):145-167, 2000.
