Introduction to Handwritten Signature Verification
Dave Fenton
University of Ottawa
SPOT presentation, University of Ottawa, 29 Oct 2004 – p. 1/53
Handwritten signature verification
Presentation overview (altered to remove all signatures):
• Goal, applications and assumptions
• Basic concepts in biometrics
• Experimental setup
• Technical difficulties posed by HSV
• Past research and the current state of the art
• Overview of my research
Goal of HSV
• To verify a person’s identity based on the way in which he/she signs his/her name
• Two types of system:
  • Offline systems use static features (the signature image)
  • Online systems use dynamic features (the time series)
• Written passwords are also under consideration
Applications
Principal application: reduce fraud in financial transactions
• Cannot rely on sales staff to visually verify signatures on credit card receipts
• Occasional acceptances of forgeries are allowable
• Rejections of valid signatures may irritate valuable customers
• To date, used mostly for electronic signature of business documents (hash function protects document against alteration)
Document verification
• Apply hash function to document to generate hash code
• If signature is valid, encrypt hash with signer’s private key
• Recipient decodes received hash using public key
• If document has been altered, hashes don’t match
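The hash-and-compare step above can be sketched with Python's hashlib. A real deployment would encrypt the digest with the signer's private key (e.g. RSA) and the recipient would recover it with the public key before comparing; this minimal sketch shows only the tamper-detection idea, and the document contents are invented.

```python
import hashlib

def digest(document: bytes) -> str:
    """Hash the document. The signer would encrypt this digest with a
    private key; the recipient decrypts it with the matching public key
    and compares it against a freshly computed digest."""
    return hashlib.sha256(document).hexdigest()

# Hypothetical documents for illustration only.
original = b"Pay to the order of Alice: $100"
altered  = b"Pay to the order of Alice: $900"

# Any alteration to the document changes the hash, so the recomputed
# digest no longer matches the signed one.
assert digest(original) == digest(original)
assert digest(original) != digest(altered)
```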
Applications
Secondary application: access security for buildings or mobile computing devices
• For building security, it would not be tolerable to accept forgeries
• HSV would have to be combined with on-site security staff or other biometric/password/PIN systems
• Already used on some laptops and PDAs
Naïve assumptions
• A person signs his or her name consistently each time
• All signatures contain enough steady features to be reliably verified
• A forger cannot perfectly imitate the dynamic features of a signature
• All a user’s passwords can be replaced by his/her signature
Example of consistency – static
Password: “Prejunife”
[Three genuine signature images: 21 July 2003, 2 Sept 2003, 24 Sep 2003]
Example of consistency – dynamic
[Seven stacked plots of Y velocity (cm/s) vs. normalized time, showing closely matching genuine dynamics]
Example of inconsistency – static
Password: “Ingusions”
[Three genuine signature images: 12 Jan 2004, 18 Mar 2004, 27 Sep 2004]
Example of inconsistency – dynamic
[Six stacked plots of Y velocity (cm/s) vs. normalized time, showing divergent genuine dynamics]
Example of forger ability – static
Password: “Prejunife”
[Forged signature images]
Example of forger ability – dynamic
[Six stacked plots of Y velocity (cm/s) vs. normalized time, comparing genuine and forged dynamics]
More realistic assumptions
• Most signers sign their names consistently
• Most signatures contain enough steady features to be reliably verified
• Most forgers cannot reproduce a signature well enough to defeat a good verifier
• It is more difficult to forge both the static and dynamic features of a signature than just the static features
Fallout from broken assumptions
• It may not be possible to verify all signatures reliably
• For any signature, there will probably exist a skilled forger who can forge it competently
• Serious consideration must be given to passwords:
  • confidential
  • easily replaced if template compromised
  • can exert some control over the length (quality of features)
  • can request the signer to write legibly
Handwritten signature verification
Presentation overview:
• Goal, applications and assumptions
• Basic concepts in biometrics
• Experimental setup
• Technical difficulties posed by HSV
• Past research and the current state of the art
• Overview of my research
Basics of biometrics
Physical v. behavioural biometrics:
• A physical biometric makes use of a fixed characteristic of the body (e.g. fingerprints, iris patterns, retina patterns, hand geometry, facial features)
  • The most accurate methods are usually perceived as too intrusive.
• A behavioural biometric makes use of personal behaviours which are assumed to be almost invariant (e.g. voice, handwriting, typing, gait)
  • Perceived as less intrusive, but less accurate than physical biometrics
Two stages
1. Enrolment:
  • A user’s signature characteristics are learned from a small number of input samples. The resulting information is called the template.
  • Typically, 3 – 5 signatures are used
2. Verification or recognition:
  • For verification, a candidate signature is compared to the template of a single signer.
  • For recognition, the candidate signature must be compared against many templates.
FRR and FAR
• Two error rates are specified:
  • The False Rejection Rate (FRR) is the rate at which valid signatures are rejected.
  • The False Acceptance Rate (FAR) is the rate at which forged signatures are accepted as valid.
• In many cases, low FRR implies high FAR, and vice-versa
• Current state of the art: FRR and FAR sum to 2 – 5%. Actual numbers may be even worse!
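A minimal sketch of how FRR and FAR would be measured at a fixed decision threshold, assuming a verifier whose higher output scores mean greater similarity to the template; the score values and threshold below are invented for illustration.

```python
def error_rates(genuine_scores, forgery_scores, threshold):
    """FRR: fraction of genuine signatures scoring below the threshold
    (falsely rejected). FAR: fraction of forgeries scoring at or above
    it (falsely accepted). Assumes higher score = closer to template."""
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    far = sum(s >= threshold for s in forgery_scores) / len(forgery_scores)
    return frr, far

genuine = [0.9, 0.8, 0.85, 0.4]   # one genuine signature falls below 0.5
forgery = [0.3, 0.2, 0.6, 0.1]    # one forgery sneaks above 0.5

frr, far = error_rates(genuine, forgery, threshold=0.5)
# frr = 0.25, far = 0.25
```

Raising the threshold drives FAR down and FRR up, which is the trade-off described above.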
ROC curves
• Most verifiers have a single numerical output. If the output level is above a decision threshold, the signature is accepted as valid; otherwise it is rejected.
• In this case, the FRR and FAR can both be plotted against the decision threshold in a receiver operating characteristic (ROC) curve.
Example ROC curves
[Plot: error rate (0 – 1) vs. decision threshold for the False Acceptance Rate (FAR) and False Rejection Rate (FRR)]
Example ROC curves
[Plot: FAR and FRR vs. decision threshold; the curves cross at the Equal Error Rate]
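The Equal Error Rate marked on the curve can be located numerically by sweeping the threshold until FRR and FAR are as close as possible. A sketch, again assuming a similarity-score verifier; the score lists are invented:

```python
def equal_error_rate(genuine, forgery, steps=1000):
    """Sweep the decision threshold over the score range and return
    (threshold, rate) where |FRR - FAR| is smallest: the EER point."""
    lo, hi = min(genuine + forgery), max(genuine + forgery)
    best = None
    for i in range(steps + 1):
        t = lo + (hi - lo) * i / steps
        frr = sum(s < t for s in genuine) / len(genuine)
        far = sum(s >= t for s in forgery) / len(forgery)
        if best is None or abs(frr - far) < best[2]:
            best = (t, (frr + far) / 2, abs(frr - far))
    return best[0], best[1]

# These toy scores separate cleanly, so the EER comes out as 0.0.
t, eer = equal_error_rate([0.9, 0.8, 0.7, 0.6], [0.4, 0.3, 0.5, 0.2])
```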
Types of forgery
• Random. A random forgery is simply another person’s valid signature.
• Simple. The forger spells the name correctly, but writes in his own style.
• Skilled, or knowledgeable. The forger tries to fully reproduce all the shapes and dynamics of the original signature. In this study, forgers are shown MPEG movies of the original signature.
• The training set consists of a few valid signatures and many random forgeries.
• After training, the verifier is tested against all 3 types of forgery.
Genuine samples
Password: “Taximotels”
[Three genuine signature images: 19 Sep 2003, 16 Oct 2003, 6 Nov 2003]
Forgeries
Password: “Taximotels”
[Three forged signature images: random, simple, knowledgeable]
Motivations for research
• Despite company claims, error rates are high and need improvement
• For computing devices with pen inputs (PDAs, tablet PCs), automatic signature verification is a sensible technology
• Signatures are already a widely accepted means of identification
Handwritten signature verification
Presentation overview:
• Goal, applications and assumptions
• Basic concepts in biometrics
• Experimental setup
• Technical difficulties posed by HSV
• Past research and the current state of the art
• Overview of my research
Data collection
• Signatures collected using an Interlink Electronics ePad-ink (100 Hz sampling frequency)
• Captures X & Y position, pressure, time stamp
• Data collection program written in C++
• Data protected by PGPdisk
Data collection
• Two levels of volunteer:
  • Level 1: one signing session
  • Level 2: three signing sessions
• Each volunteer contributes:
  • 10 samples of genuine signature
  • 10 samples of genuine password
  • Simple forgeries of 2 signatures and 2 passwords
  • Knowledgeable forgeries of a signature and a password
Enrolment
[Flow diagram: acquire signature → preprocess → extract features → select features → create template]
Acquire valid signatures:
• Operational systems typically collect 3 – 5 genuine signatures; academic systems up to 20
• Some use warping and interpolation schemes to “create” extra valid signatures
Enrolment
[Flow diagram: acquire signature → preprocess → extract features → select features → create template]
Preprocess:
• Concatenate strokes into single time sequence
• Render invariant to:
  • translation: subtract X & Y means
  • rotation: force linear regression line to be horizontal
  • scale: may normalize based on box size or signal power
• Normalization of duration is not carried out at this stage
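The three invariance steps can be sketched as follows. This is one plausible implementation (centring, rotating the least-squares regression line to horizontal, dividing by bounding-box width), not necessarily the exact scheme used in this study:

```python
import math

def preprocess(xs, ys):
    """Normalize a concatenated pen trajectory: remove translation
    (subtract the X and Y means), rotation (rotate so the least-squares
    regression line is horizontal), and scale (divide by bounding-box
    width). Duration is deliberately left untouched at this stage."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    xs = [x - mx for x in xs]
    ys = [y - my for y in ys]
    # Regression slope through the centred points is sxy/sxx;
    # rotate the trajectory by the negative of its angle.
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    theta = math.atan2(sxy, sxx)
    c, s = math.cos(-theta), math.sin(-theta)
    xs, ys = ([c * x - s * y for x, y in zip(xs, ys)],
              [s * x + c * y for x, y in zip(xs, ys)])
    width = (max(xs) - min(xs)) or 1.0
    return [x / width for x in xs], [y / width for y in ys]
```

Applied to a stroke drawn along a 45° diagonal, the output lies flat on the horizontal axis with unit width.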
Enrolment
[Flow diagram: acquire signature → preprocess → extract features → select features → create template]
Extract features:
• May extract hundreds of features. Examples:
  • Function features: time series such as velocity or acceleration
  • Dynamic discrete features: signing time, number of strokes, pen-down distance, max velocity, mean pressure, time to write longest stroke
  • Static discrete features: bounding box, slant
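A few of the dynamic discrete features named above could be computed like this; the sequence layout (parallel lists of time, X, Y and pressure samples, with pressure 0 meaning pen up) is an assumption for illustration.

```python
import math

def discrete_features(t, x, y, p):
    """Compute a few dynamic discrete features from sampled time,
    position and pressure sequences (pressure 0 means pen up)."""
    # Speed between consecutive samples (distance over elapsed time).
    vel = [math.hypot(x[i + 1] - x[i], y[i + 1] - y[i]) / (t[i + 1] - t[i])
           for i in range(len(t) - 1)]
    pen_down = [i for i in range(len(p)) if p[i] > 0]
    return {
        "signing_time": t[-1] - t[0],
        "max_velocity": max(vel),
        "mean_pressure": sum(p[i] for i in pen_down) / len(pen_down),
    }
```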
Enrolment
[Flow diagram: acquire signature → preprocess → extract features → select features → create template]
Select features:
• With function features, typically use the same features for each signer
• Not all discrete features are equally informative
• Cost used for feature selection is usually error rate; classifier dependent
• Sequential forward/backward search
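Sequential forward search can be sketched as a greedy loop; the toy merit table and cost function below are invented, and in practice the cost would be the downstream classifier's error rate on held-out signatures.

```python
def sequential_forward_selection(features, cost, k):
    """Greedy sequential forward search: repeatedly add the feature
    whose inclusion minimizes the (classifier-dependent) cost until
    k features have been chosen."""
    selected = []
    remaining = list(features)
    while remaining and len(selected) < k:
        best = min(remaining, key=lambda f: cost(selected + [f]))
        selected.append(best)
        remaining.remove(best)
    return selected

# Hypothetical per-feature merits; cost falls as total merit rises.
merit = {"time": 0.5, "maxv": 0.3, "slant": 0.1}
chosen = sequential_forward_selection(
    list(merit), lambda s: 1 - sum(merit[f] for f in s), k=2)
# chosen == ["time", "maxv"]
```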
Enrolment
[Flow diagram: acquire signature → preprocess → extract features → select features → create template]
Create template:
• Best performance so far: keep raw data of multiple signatures
• Bad practice, from security perspective
• Template may also include list of features to keep, best classifier to use, decision thresholds
Verification
[Flow diagram: acquire candidate signature and template ID → preprocess → extract selected features → retrieve template → decision process → output]
• Initial steps of verification are same as enrolment
• Only selected features need to be extracted
Verification
[Flow diagram: acquire candidate signature and template ID → preprocess → extract selected features → retrieve template → decision process → output]
• Template ID is acquired at same time as candidate signature
• Designated by swipe card, PIN, etc.
• Template is usually stored in central database, but may also be held on swipe card
Verification
[Flow diagram: acquire candidate signature and template ID → preprocess → extract selected features → retrieve template → decision process → output]
• Many different classifiers have been tried
• May have to combine results from multiple classifiers
• Will be covered in more detail later
Handwritten signature verification
Presentation overview:
• Goal, applications and assumptions
• Basic concepts in biometrics
• Experimental setup
• Technical difficulties posed by HSV
• Past research and the current state of the art
• Overview of my research
Technical difficulties
• Physiology of handwriting is not well understood
• Signers not motivated to sign in a careful, invariant manner
• Training sets are sparse and badly imbalanced (few valid signatures)
• No knowledgeable forgeries available for training
• Short, variable signatures often easily forged
Technical difficulties
• No standard database of signatures and forgeries:
  • every researcher uses a different set of amateur forgeries
  • some researchers test only against random forgeries
  • FAR is ill-defined; a low error rate may reflect the forgers’ lack of skill rather than the verifier’s ability
Technical work-arounds
• Disqualify certain signers during enrolment
• Allow multiple signing attempts
• Allow probationary period with relaxed acceptance criteria (collect more training signatures)
• Use passwords with a certain minimum length
Handwritten signature verification
Presentation overview:
• Goal, applications and assumptions
• Basic concepts in biometrics
• Experimental setup
• Technical difficulties posed by HSV
• Past research and the current state of the art
• Overview of my research
Past research
• Research has been underway for several decades. Peak activity in mid-1980s to mid-1990s.
• Early ideas are still good performers because they have fewer control parameters
• Between 30 – 60% of forgeries can be detected by a basic time verifier
Time verifier
[Histograms of total signing time (seconds), with the time verifier’s acceptance region marked:]
• Genuine signatures: sample mean = 3.58, sample deviation = 1.53
• Simple forgeries: sample mean = 4.46, sample deviation = 1.60
• Knowledgeable forgeries: sample mean = 6.11, sample deviation = 2.74
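A basic time verifier of the kind shown can be sketched as an interval test on total signing time; the window width k and the enrolment times below are illustrative assumptions, not values from the study.

```python
def make_time_verifier(enrolment_times, k=2.0):
    """Build a one-feature verifier from enrolment signing times:
    accept a candidate if its total signing time falls within
    mean +/- k sample standard deviations (k is a tunable assumption)."""
    n = len(enrolment_times)
    mean = sum(enrolment_times) / n
    var = sum((t - mean) ** 2 for t in enrolment_times) / (n - 1)
    std = var ** 0.5
    return lambda t: abs(t - mean) <= k * std

# With the slide's genuine statistics (mean 3.58 s, deviation 1.53 s),
# a k = 2 window spans roughly 0.52 - 6.64 s, so a knowledgeable
# forgery averaging 6.11 s would often be accepted; this is one reason
# time alone detects only 30 - 60% of forgeries.
verify = make_time_verifier([2.5, 3.4, 3.9, 4.5])  # hypothetical times
```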
Classifiers
• Euclidean distance
• weighted linear metrics
• regional correlation
• dynamic time warping (DTW)
• neural networks
• hidden Markov models (HMMs)
• Bayesian belief net
Features used
• Function features (usually position, velocity and pressure)
• Vectors of discrete features
• Features calculated within sliding window (e.g. centre of mass, torque)
• Wavelet coefficients
• LPC coefficients
• Walsh transform of pen-up/pen-down signal
• Pre-defined strokes (HMMs)
Classifier issues
• Time alignment is important if using function features
• With few enrolment signatures, statistical estimates are unreliable
• Lack of training data is a severe problem for learning machines
• Data imbalance is also problematic
State of the art
• In academic studies, more complicated verifiers often achieve better results than simple verifiers
• However, in field use, simple verifiers like DTW often outperform everything else:
  • few adjustable parameters
  • with normalization, can set a single decision threshold for all signers
• Best verifier in public contest: DTW with 5-signature template
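The DTW verifier named above rests on a distance like the following textbook recurrence; this is the generic algorithm, not the contest entry's exact implementation.

```python
def dtw_distance(a, b):
    """Classic dynamic time warping between two 1-D sequences
    (e.g. Y-velocity traces): the minimal cumulative |a[i]-b[j]|
    over monotone alignments, so signatures written at slightly
    different speeds still compare well."""
    INF = float("inf")
    n, m = len(a), len(b)
    d = [[INF] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Extend the cheapest of the three admissible alignments.
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

# Warping absorbs the repeated sample, so these match perfectly:
# dtw_distance([1, 2, 3], [1, 2, 2, 3]) == 0.0
```

In a verifier, the candidate's DTW distance to the template signatures is normalized and compared against a decision threshold.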
State of the art
• Most sophisticated verifier: Plamondon’s Sign@metric solution
  • discrete parametric verifier
  • physiological delta-lognormal verifier
  • static feature verifier
  • claimed performance: error rate of 0.0003% among 86,500 people!!
• Other companies that did not take part in public contest: CIC, Cyber-SIGN, SoftPro, Wondernet
Handwritten signature verification
Presentation overview:
• Goal, applications and assumptions
• Basic concepts in biometrics
• Experimental setup
• Technical difficulties posed by HSV
• Past research and the current state of the art
• Overview of my research
My research
• Classifier comparison (DTW, NN, SVM, weighted distance metric)
• Techniques to mitigate imbalance of training data
• Re-open debate on the use of passwords
• Data analysis across signing sessions
• Feature selection algorithm that gives preferential treatment to features that are most likely to be stable
• Use of support vector machine
Questions?
To volunteer: please e-mail d.fenton@ieee.org