
Page 1:

Video Event Recognition Algorithm Assessment Evaluation Workshop (VERAAE)

ETISEO – NICE, May 10-11, 2005

Dr. Sadiye Guler - Northrop Grumman IT/TASC
Mubarak Shah, Niels da Vitoria Lobo - University of Central Florida
Rama Chellappa, Dave Doermann - University of Maryland

US Government Champions: Terrence Adams - NSA; John Garofolo, Rachel Bowers - NIST

Advanced Research and Development Activity

Page 2:

Problem

Comparative study of Video Event Recognition (VER) algorithms to assess the applicability, usefulness and limitations of different approaches.

Motivation:
- Several promising VER algorithms exist.
- The algorithms have varying degrees of success with different types of event detection.
- No widely accepted criteria or ground-truthed data set exists for VER evaluation (only a few emerging studies).
- The performance of VER algorithms is highly dependent on the results of object detection and tracking, which makes a fair comparison of just the "event recognition" step very difficult.

Page 3:

Workshop Goals

- Produce a realistic operational video event data set representing scenarios for the surveillance domain.
- Ground truth the video event data for VER and map it to a suitable Event Ontology developed in previous workshops.
- Annotate the data set with object detection and tracking metadata that serves the needs of all participating/expected event recognition algorithms.
- Develop evaluation criteria and metrics for quantitative evaluation of VER algorithms, along with software tools for evaluation.
- Assess different VER approaches for applicability to operational scenarios by their learning, explanation and recognition capabilities.

Page 4:

VERAAE Approach

[Diagram: the VERAAE approach, connecting IC Event Scenarios, Video Data, Annotated Video Data, object metadata, the Event Ontology, Video Event Recognition Algorithms, event metadata, the Evaluation Methodology and the resulting Technology Assessment.]

Page 5:

Video Event Recognition and VERAAE

[Diagram: a processing pipeline from video data through Content Extraction (object features, tracks), Event Detection (behaviors, actions, events) and Event Recognition (abnormal and suspicious events; trends, correlations), moving from signal and raw information toward semantics/ontology and knowledge/intelligence. Evaluation is provided by VERAAE across this pipeline.]
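As a rough illustration of the staged flow the diagram describes (signal toward semantics and knowledge), here is a minimal Python sketch with placeholder stage functions; the function names and stand-in outputs are assumptions for illustration only, not any actual VERAAE software.

```python
from typing import Any, Callable, Dict, List


def content_extraction(frames: List[Any]) -> List[Dict]:
    """Signal -> raw information: object features and tracks (stand-in output)."""
    return [{"track_id": 0, "centroids": [(10, 20), (12, 22)]}]


def event_detection(tracks: List[Dict]) -> List[Dict]:
    """Raw information -> semantics: behaviors, actions, events (stand-in output)."""
    return [{"event": "enter_zone", "track_id": t["track_id"]} for t in tracks]


def event_recognition(events: List[Dict]) -> List[Dict]:
    """Semantics -> knowledge: keep only abnormal or suspicious events (stand-in filter)."""
    suspicious = {"enter_zone"}
    return [e for e in events if e["event"] in suspicious]


def ver_pipeline(frames: List[Any]) -> List[Dict]:
    """Chain the three stages the diagram names, from video data to recognized events."""
    return event_recognition(event_detection(content_extraction(frames)))
```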

Page 6:

VERAAE Domain

VERAAE domain focus: surveillance, with realistic scenarios of interest:
- Events and activities that existing algorithms can detect
- Realistic high-level or complex events that end-users want to detect

These define the workshop event scenarios and the resulting data set.

Page 7:

Data Set Planning

Primary factors that determine the data requirements:
- Fixed camera views, no PTZ
- Color, B&W and IR
- Realistic operational scenarios: about 10 events of varying complexity, with at least 10 samples per event
- Collection parameters that address the functional capabilities of the algorithms
- Annotation will include the object track data required by the participating algorithms (automatically and manually generated), e.g.:
  - Silhouettes of tracked objects
  - Bounding boxes and centroids of objects (U Maryland ViPER tool)
  - Object category, e.g. vehicle, person, box, animal, ...
- Ground truth for video events will be generated using the event ontology work:
  - Frame numbers (time offsets) for event start and end, plus identified simple sub-events

A sketch of what such annotation and ground-truth records might look like is given below.
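The following minimal Python sketch illustrates one possible shape for the object-track annotation and event ground-truth records described above; the class and field names are assumptions for illustration only and are not the actual VERAAE or ViPER schema.

```python
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class ObjectTrack:
    """Per-object track metadata (hypothetical schema, not the ViPER format)."""
    object_id: int
    category: str                        # e.g. "vehicle", "person", "box", "animal"
    frames: List[int] = field(default_factory=list)
    bounding_boxes: List[Tuple[int, int, int, int]] = field(default_factory=list)  # x, y, w, h
    centroids: List[Tuple[float, float]] = field(default_factory=list)


@dataclass
class EventAnnotation:
    """Ground-truth record for one video event (hypothetical schema)."""
    event_type: str                      # label drawn from the event ontology
    start_frame: int
    end_frame: int
    sub_events: List[str] = field(default_factory=list)   # identified simple sub-events
    object_ids: List[int] = field(default_factory=list)   # tracks participating in the event


# Example: an "abandoned object" annotation spanning frames 1200-1450
abandoned = EventAnnotation(
    "abandoned_object", 1200, 1450,
    sub_events=["person_drops_object", "person_leaves_area"],
    object_ids=[3, 7],
)
```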

Page 8:

Event Ontology (Event Taxonomy workshop)

- Simple event: domain-independent action descriptors, e.g. abandoning an object.
- Compound (complex or multi-threaded) event: multiple simple events taking place under time and space constraints to achieve complex activities, e.g. planting a suspicious object, if considered together with simple events such as:
  - moving in the wrong direction
  - car parked at the curb-side
  - no one exiting the parked car
  - getting in the car
- Domain-specific high-level event: semantic interpretation of events in a particular context, over multiple views and multiple data types, e.g. sabotaging a public facility.
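To make the simple/compound distinction concrete, the toy sketch below composes simple events into a compound event under a temporal constraint; the class names, the gap constraint and the example frame numbers are illustrative assumptions, not terms defined by the ontology itself.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class SimpleEvent:
    # Domain-independent action descriptor, e.g. "abandoning an object"
    name: str
    start_frame: int
    end_frame: int


@dataclass
class CompoundEvent:
    # Multiple simple events under time/space constraints, e.g. "planting a suspicious object"
    name: str
    parts: List[SimpleEvent]
    max_gap_frames: int = 300  # illustrative temporal constraint between consecutive parts

    def satisfied(self) -> bool:
        """Check that the parts occur in order with bounded gaps (toy temporal constraint)."""
        ordered = sorted(self.parts, key=lambda e: e.start_frame)
        return all(nxt.start_frame - prev.end_frame <= self.max_gap_frames
                   for prev, nxt in zip(ordered, ordered[1:]))


planting = CompoundEvent("planting_suspicious_object", [
    SimpleEvent("car_parked_at_curb", 100, 400),
    SimpleEvent("no_one_exits_car", 100, 600),
    SimpleEvent("moving_in_wrong_direction", 650, 800),
    SimpleEvent("getting_in_the_car", 820, 900),
])
print(planting.satisfied())  # True for this toy instance
```

A fuller ontology would of course also carry spatial constraints and actor/object roles; the point here is only that compound events are defined by composition over simple ones.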

Page 9:

Recognizing Surveillance Events

Surveillance event types from the user's point of view:

- Violation of some rule, e.g. wrong direction (in through the out door), abandoned object (suitcase left unattended for t > T). Naturally represented by rules and constraints; users can easily describe them.
- Suspicious or interesting activity, e.g. no exit from a parked car, repeated visits to a store shelf. Highly context dependent, even on context from other camera views; users cannot easily describe it, but know it when they see it.
- Abnormal activity, e.g. approaching several cars in the lot, several somewhat suspicious events in close proximity. Naturally represented by probabilistic models and learning; users build a sense of "normalcy".
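To illustrate why rule-violation events are naturally represented by rules and constraints, here is a toy Python sketch of the abandoned-object case (suitcase left unattended for t > T); the thresholds, category names and input format are assumptions for illustration, not part of the workshop's definitions.

```python
from collections import defaultdict
from math import dist


def abandoned_object_alerts(objects, people, fps=25, t_threshold_s=60, radius_px=150):
    """Toy rule: a 'suitcase' detection is flagged as abandoned once no person centroid
    has been within radius_px of it for longer than t_threshold_s (illustrative only).

    objects / people: lists of per-frame detections, each a dict with keys
    'id', 'category', 'frame' and 'centroid' (x, y).
    """
    people_by_frame = defaultdict(list)
    for p in people:
        people_by_frame[p["frame"]].append(p["centroid"])

    alerts = []
    unattended_since = {}    # object id -> first frame with no person nearby
    already_alerted = set()
    for det in sorted(objects, key=lambda d: d["frame"]):
        if det["category"] != "suitcase" or det["id"] in already_alerted:
            continue
        nearby = any(dist(det["centroid"], c) < radius_px
                     for c in people_by_frame.get(det["frame"], []))
        if nearby:
            unattended_since.pop(det["id"], None)
            continue
        start = unattended_since.setdefault(det["id"], det["frame"])
        if (det["frame"] - start) / fps > t_threshold_s:
            alerts.append({"event": "abandoned_object",
                           "object_id": det["id"], "frame": det["frame"]})
            already_alerted.add(det["id"])
    return alerts
```

Suspicious and abnormal events, by contrast, would replace such hand-written thresholds with learned models of normal behavior, which is exactly the contrast the slide draws.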

Page 10:

Recognizing Surveillance Events

Knowing what can be detected, we describe the events using not only observable but also detectable actions.

Example: shoplifting
- Camera 1, in the store:
  - Repeated visits to an area
  - Running in the store
- Camera 2, in the parking lot:
  - Car in front of the emergency exit
  - No one exits from the car
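A minimal sketch of how detectable cues from the two cameras might be fused into a single shoplifting hypothesis; the cue names, confidences and weights are illustrative assumptions, not a method proposed by the workshop.

```python
# Hypothetical per-camera cue detections (cue name, confidence), assumed already time-aligned.
camera1_cues = [("repeated_visit_to_area", 0.8), ("running_in_store", 0.6)]
camera2_cues = [("car_at_emergency_exit", 0.9), ("no_one_exits_car", 0.7)]

# Illustrative weights expressing how strongly each detectable cue supports "shoplifting".
CUE_WEIGHTS = {
    "repeated_visit_to_area": 0.3,
    "running_in_store": 0.2,
    "car_at_emergency_exit": 0.3,
    "no_one_exits_car": 0.2,
}


def shoplifting_score(*cue_lists):
    """Weighted sum of cue confidences across cameras (toy fusion, not a real model)."""
    return sum(CUE_WEIGHTS.get(name, 0.0) * conf
               for cues in cue_lists for name, conf in cues)


score = shoplifting_score(camera1_cues, camera2_cues)
print(f"shoplifting score: {score:.2f}")  # 0.3*0.8 + 0.2*0.6 + 0.3*0.9 + 0.2*0.7 = 0.77
```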

Page 11:

Rule-Based Event: Violation by an activity constraint – car parked in the driveway

Page 12:

Rule-Based Event: Violation by an object class constraint

Page 13:

Suspicious Event: “testing” the exclusion zone

Page 14:

Abnormal Event: Vehicle casing the building

Page 15:

Abnormal Event: Large Vehicle at the Gate

Page 16:

Workshop Timeline

This is a "seedling" workshop to investigate feasibility; the timeline runs from May 05 through a final report in December 05:

- Planning, invitations, communications
- First Workshop Meeting: June 20/21, with CVPR
- Scenario Focus Meeting
- Evaluation Criteria Focus Meeting
- Data and evaluation criteria generation and distribution
- Workshop Dry-Run Meeting: October (3rd week), in Boston
- Evaluation tools development, evaluation results, final report
- Workshop Final Meeting
- Final report: December 05

Page 17:

Workshop Approach

First Workshop Meeting (2 days, June)

Purpose:
- Workshop goals and vision
- Presentation and determination of algorithms to participate in the workshop
- Presentation of example data sequences

Outcome:
- Outline of the data requirements (object tracking, data exchange protocols, etc.)
- Draft a rough set of evaluation criteria
- Solicit feedback on scenario complexity and realism

Page 18:

Workshop Approach

Evaluation Criteria Focus Meeting (2 days, July)

Purpose: determine the evaluation criteria best suited for VER.

Outcome: evaluation criteria will be developed interactively in the workshop meetings, leveraging the Event Ontology, VEML and ETISEO workshop findings. Evaluation metrics at the component and system level will be defined based on:
- Recognition rate
- Learning rate
- Recall and precision rates
- True/false positives, true/false negatives, and the relevance of false detections
- Event decomposition (based on the ontology-defined sub-event recognition rate)

A sketch of how such recall/precision counts might be computed is given below.
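The following minimal Python sketch shows one way recall/precision and true/false counts could be computed by matching detected events against ground truth on event type and temporal overlap; the matching criterion (any overlap of same-type intervals) is an assumption for illustration, not the workshop's finalized metric.

```python
def evaluate_events(detections, ground_truth, min_overlap=1):
    """Count TP/FP/FN by greedily matching each same-type detection to an unmatched
    ground-truth event that overlaps it in time, then derive precision and recall.

    Each event is a dict: {"type": str, "start": int, "end": int} (frame numbers).
    """
    def overlap(a, b):
        # Number of frames shared by the two intervals (can be <= 0 if disjoint).
        return min(a["end"], b["end"]) - max(a["start"], b["start"]) + 1

    matched = set()
    tp = 0
    for det in detections:
        for i, gt in enumerate(ground_truth):
            if i in matched:
                continue
            if det["type"] == gt["type"] and overlap(det, gt) >= min_overlap:
                matched.add(i)
                tp += 1
                break
    fp = len(detections) - tp
    fn = len(ground_truth) - tp
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return {"TP": tp, "FP": fp, "FN": fn, "precision": precision, "recall": recall}
```

A stricter criterion (e.g. a minimum overlap ratio, or sub-event-level matching for the event decomposition metric) could be substituted without changing the structure of the computation.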

Page 19:

Workshop Approach

Workshop Dry-Run Meeting (2 days, October 05)

Purpose and outcomes:
- Participants' feedback on processing the sample data sets
- Evaluation tools and methodology presentation
- Evaluating the "evaluation criteria" and finalizing all metrics to be used
- Planning of the evaluation format
- Discussion of the interpretation of results

Page 20:

Workshop Results

- Raw and annotated (with object detection and tracking data) video sequences for realistic operational scenarios
- Event recognition ground truth data based on the surveillance Event Ontology
- Re-usable and extensible evaluation criteria suitable for VER
- Software tools for event detection evaluation
- The groundwork for a formal VER evaluation process