
Page 1: ViSEvAl

ViSEvAl

ViSualisation

and

EvAluation

http://team.inria.fr/stars/fr/2012/02/02/viseval-software/

Page 2: ViSEvAl

Overview

What is evaluation?

Evaluation process

Metric definition

ViSEvAl Description

Installation

Configuration

Functionalities

Page 3: ViSEvAl

Evaluation process

General overview

Page 4: ViSEvAl

Metric definition

Metric = distance + filter + criteria

Distance: associates detected and annotated objects

Spatial: compare bounding box areas

Temporal: compare time intervals

Filter: selects which objects to evaluate

Specific type, distance to the camera, ...

Criteria: how similar are the properties of the detected and annotated objects?

4 tasks: detection, classification, tracking, event detection
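In the configuration file shown on Page 11, each metric line composes exactly these three ingredients. For example, its MetricTemporal line pairs criterion M3.4 with the DiceCoefficient distance and the TypeGroup filter (the 0.5 is presumably the association threshold):

MetricTemporal "Mono:M3.4:M3.4:DiceCoefficient:0.5:TypeGroup"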

Page 5: ViSEvAl

ViSEvAl platform description

[Architecture diagram: the core library provides all functionalities (synchronisation, display, ...); the two tools ViSEvAlGUI and ViSEvAlEvaluation are built on top of it; plugin interfaces cover Distance, Object Filter, Frame Metric, Temporal Metric, Event Metric and Loading Video.]

Page 6: ViSEvAl

ViSEvAl plugins 1/2

Loading video: ASF-Videos, Caviar-Images, JPEG-Images, Kinect-Images (hospital), OpenCV-Videos (Vanaheim), PNG-Images

Distance: Bertozzi, dice coefficient, overlapping

Filter: CloseTo, FarFrom, Identity, TypeGroup

Page 7: ViSEvAl

ViSEvAl plugins 2/2

Criteria:

Detection: M1.X, 2 criteria (M1.1: area, M1.2: silhouette)

Classification: M2.X, 2 criteria (M2.1: type, M2.2: sub-type)

Tracking: M3.X, 6 criteria (M3.1: F2F, M3.2: persistence, M3.4: tracking time, M3.5: confusion, M3.6 and M3.7: confusion + tracking time, M3.8: frame detection accuracy)

Event: M4.X, 4 criteria (M4.1 and M4.2: begin and end time, M4.3: common frame, M4.4: common time)

Page 8: ViSEvAl

ViSEvAl: inputs

A set of XML files

Detection: XML1 file -> sup platform

Recognised event: XML3 file -> sup platform

Ground truth: xgtf file -> Viper tool

Time stamp file for time synchronisation: xml file -> createTimeStampFile.sh script provided by ViSEvAl
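createTimeStampFile.sh ships in ViSEvAl's scripts folder (see Page 10). The slides do not document its arguments, so the invocation below is only a hypothetical sketch:

scripts/createTimeStampFile.sh <sequence-folder>   # hypothetical argument; writes the timestamp xml file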

Page 9: ViSEvAl

ViSEvAl installation

Get the sources: sup svn repository

cd sup/evaluation/ViSEvAl/

Run install.sh at the root of the ViSEvAl folder

Dependencies:

Libraries: Qt4 (graphical user interface, plugin management), gl and glu (OpenGL 3D view), xerces-c (XML reading), OpenCV (video reading)

Tool: xsdcxx (automatically generates C++ classes for reading XML files)

cd bin/appli; setenv LD_LIBRARY_PATH ../../lib

Run ./ViSEvAlGUI chu.conf
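Put together, the install-and-run sequence above amounts to the following sketch (assuming a csh-style shell, to match the setenv syntax, and an existing sup svn checkout):

cd sup/evaluation/ViSEvAl/
./install.sh                      # build the core library, plugins and tools
cd bin/appli
setenv LD_LIBRARY_PATH ../../lib  # so the binaries find the core library in lib/
./ViSEvAlGUI chu.conf             # launch the GUI on the CHU example configuration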

Page 10: ViSEvAl

ViSEvAl folder organisation

src: appli, plugins (Cdistance, CeventMetric, CframeMetric, CloadingVideoInterface, CobjectFilter, CTemporalMetric)

include: header files

install.sh, clean.sh

doc: documentation

lib: core library, plugins

scripts: createTimeStampFile.sh, makeVideoFile.sh, splitxml1-2-3file.sh

bin: ViSEvAlGUI, ViSEvAlEvaluation

tools: CaviarToViseval, QuasperToViseval

xsd: xml schemas

Page 11: ViSEvAl

ViSEvAl: configuration file

The configuration file is based on keyword-parameter pairs

SequenceLoadMethod "JPEG-Images" # "ASF-Videos"

SequenceLocation "0:../../example/CHU/Scenario_02.vid"

TrackingResult "0:../../example/CHU/Scenario_02_Global_XML1.xml"

EventResult "../../example/CHU/Scenario_02_Global_XML3.xml"

GroundTruth "0:../../example/CHU/gt_2011-11-15a_mp.xgtf"

XMLCamera "0:../../example/CHU/jai4.xml"

MetricTemporal "Mono:M3.4:M3.4:DiceCoefficient:0.5:TypeGroup"

MetricEvent "M4.2:M4.2.1:Duration:10"
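A plausible reading of this example, not spelled out in the slides: each keyword takes one quoted parameter, and the leading "0:" prefix binds a file to camera index 0 (the result file on Page 14 reports "camera: 0"), so a multi-camera experiment would presumably repeat those keywords with further indices, e.g. a hypothetical second camera:

XMLCamera "1:../../example/CHU/second_camera.xml" # hypothetical line, not from the slides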

Page 12: ViSEvAl

ViSEvAl run trace

Mon, 11:15> ./ViSEvAlGUI
Load all the plugins
------------------------------------
Loading video interfaces: ASF-Videos Caviar-Images JPEG-Images Kinect-Images OpenCV-Videos PNG-Images
------------------------------------
Loading distance: 3DBertozzi 3DDiceCoefficient 3DOverlapping Bertozzi DiceCoefficient Overlapping
------------------------------------
Loading object filter: CloseTo FarFrom Identity TypeGroup
------------------------------------
Loading frame metric: M1.1 M1.2 M2.1 M2.2 M3.1
------------------------------------
Loading temporal metric: M3.2 M3.4 M3.5 M3.6 M3.7 M3.8
------------------------------------
Loading event metric: M4.1 M4.2 M4.3 M4.4
------------------------------------

Page 13: ViSEvAl

ViSEvAl: two tools

ViSEvAlGUI

Graphical user interface

Visualise detection and ground truth on the images

User can easily select parameters (e.g. distance, threshold,...)

Frame metric results are computed live

ViSEvAlEvaluation

Generate a .res file containing the results of the metrics

Frame, temporal and event metrics are computed

User can evaluate several experiments

The same configuration file is used for both tools
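Since both tools share one configuration file, a session could look like this (a sketch; only the GUI invocation appears explicitly in these slides):

./ViSEvAlGUI chu.conf          # interactive visualisation with live frame metrics
./ViSEvAlEvaluation chu.conf   # batch run writing frame, temporal and event metrics to a .res file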

Page 14: ViSEvAl

ViSEvAl: result file (.res)

camera: 0
Tracking result file: /user/bboulay/home/work/svnwork/sup/evaluation/ViSEvAl/example/vanaheim/res.xml1.xml
Fusion result file:
Event result file:
Ground truth file: /user/bboulay/home/work/svnwork/sup/evaluation/ViSEvAl/example/vanaheim/Tornelli-2011-01-28T07_00_01_groups.xgtf
Common frames with ground-truth:
Detection results: 7978 7979 7980 7981 7983 7984 7985

*****

====================================================
Metric M1.1.1
====================================================
Frame;Precision;Sensitivity 0;True Positive;False Positive;False Negative 0;Couples
8004;0.500000;1.000000;1;1;0;(100;170;0.737438)
8005;0.500000;1.000000;1;1;0;(100;170;0.721577)
8006;0.500000;1.000000;1;1;0;(100;170;0.706809)
8007;0.500000;1.000000;1;1;0;(100;170;0.713584)
====================================================
Final Results:
Global results:
Number of True Positives : 1789
Number of False Positives : 1597
Number of False Negatives 0 : 2254
Precision (mean by frame) : 0.523071
Sensitivity 0 (mean by frame) : 0.477763
Precision (global) : 0.528352
Sensitivity 0 (global) : 0.442493
---------------------------------------------
Results for GT Object 2
Number of True Positives : 0
Number of False Positives : 0
Number of False Negatives 0 : 0
Precision (global) : 0.000000
Sensitivity 0 (global) : 0.000000
---------------------------------------------
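For reference, the global figures follow the standard definitions Precision = TP / (TP + FP) and Sensitivity = TP / (TP + FN): 1789 / (1789 + 1597) ≈ 0.528352 and 1789 / (1789 + 2254) ≈ 0.442493, matching the values reported above. The "mean by frame" variants presumably average the per-frame ratios instead of pooling the counts, which is why they differ slightly.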
