Analysis of Biometric Authentication Protocols in
the Blackbox Model
Koen Simoens, Julien Bringer, Hervé Chabanne, and Stefaan Seys
January 3, 2011
Abstract—In this paper we analyze different biometric authentication protocols considering an internal adversary. Our contribution takes place at two levels. On the one hand, we introduce a new comprehensive framework that encompasses the various schemes we want to look at. On the other hand, we exhibit actual attacks on recent schemes such as those introduced at ACISP 2007, ACISP 2008, and SPIE 2010, and some others. We follow a blackbox approach in which we consider components that perform operations on the biometric data they contain and where only the input/output behavior of these components is analyzed.

Keywords—Biometrics, template protection, authentication, protocols, blackbox security model, malicious adversaries

I. INTRODUCTION

ALTHOUGH biometric template protection is a relatively young discipline, already over a decade of research has brought many proposals. Methods to secure biometric data can be separated into three levels. The first one is to have biometric data coming in a self-protected form. Many algorithms have been proposed: quantization schemes [1], [2] for continuous biometrics; fuzzy extractors [3] and other fuzzy schemes [4]–[6] for discrete biometrics; and cancellable biometrics [7]–[9]. The security of such template-level protection has been intensively analyzed, e.g., in [10]–[13]. On a second level one can use hardware to obtain secure systems, e.g., [14], [15]. Finally, at a third level advanced protocols can be developed to achieve biometric authentication relying on advanced cryptographic techniques such as Secure Multiparty Computation, homomorphic encryption or Private Information Retrieval protocols [16, Ch. 9] [17]–[24].

The focus of our work is on this third level: we analyze and attack some existing biometric authentication protocols. We follow [25], where an attack against a hardware-assisted secure architecture [15] is described. The work of [25] introduces a blackbox model that is taken up and extended here. In this blackbox model, internal adversaries are considered. These adversaries can interact with the system by using the available inputs/outputs of the different functionalities. Moreover, the adversaries are malicious in the sense that they can deviate from the classical honest-but-curious behaviour, which is most often assumed.

Our contributions are the following. We extend the blackbox framework initiated in [25] with the distributed system model of [19] in a way that it can handle different existing proposals for biometric authentication. We show how this blackbox approach can lead to attacks against these proposals. We describe in detail our analysis of three existing protocols [19], [20], [22] and give arguments on some others [23], [24]. In the framework we propose, we study how the previous attacks can be formalized. We list all the possible attack points and the different internal entities that can lead the attacks, and we reveal the potential consequences.

The rest of the paper is organized as follows. The framework is developed in Section II and introduces the system and attack model. This is then applied to existing protocols in Section III, where detailed attacks are described. Section IV formalizes these attacks and Section V concludes the paper.

II. FRAMEWORK

In this section we present a framework that forms a basis for the security analysis of biometric authentication protocols. The framework models a generic distributed biometric system and the (internal) adversaries against such a system. We define the roles of the different entities that are involved and their potential attack goals. From these roles and attack goals we derive the requirements that are imposed on the data that are exchanged between the entities.

Biometric Notation: Two measurements of the same biometric characteristic are never exactly the same. Because of this behavior, a biometric characteristic is modeled as a random variable B, with distribution p_B over some range B. A sample is denoted as b. Two samples or templates are related if they originate from the same characteristic. In practice, we will say they are related if their mutual distance is less than some threshold. Therefore, a distance function d is defined over B, and for each value in the range of d that is used as the threshold when comparing two samples a false match rate (FMR) and a false non-match rate (FNMR) can be derived.

Biometric variables can be continuous or discrete, but in the remainder of the paper we will assume that they are discrete. Note that the variables may consist of multiple components. For example, a sample may consist of a bitstring, which is the quantized version of a feature vector, and another bitstring that indicates erasures or unreliable components in the first and thus acts as a mask.

A. System Model

Our system model follows to a large extent the model defined by Bringer et al. [19], which was also used to define new schemes in [20] and [26]. This model is motivated by
a separation-of-duties principle: the different roles for data processing or data storage on a server are separated into three distinct entities. Using distributed entities is a baseline to prevent one entity from controlling all information, and it is a realistic representation of how current biometric systems work in practice (cf. [27]).

System Entities: The different entities involved in the system are a user U_i, a sensor S, an authentication server AS, a database DB and a matcher M. User U_i wishes to authenticate to a particular service and has, therefore, registered his biometric data b_i during the enrollment procedure. In the context of the service the user has been assigned an identifier ID_i, which only has meaning within this context. The biometric reference data b_i are stored by DB, who links the data to identifier i. The mapping from ID_i to i is only known by AS, if relevant. Note that in some applications it is possible that the same user is registered for the same service or in the same database with different samples, b_i and b_j, and different identities, i.e., ID_i ≠ ID_j in the service context or i ≠ j in the database context. The property of not being able to relate queries under these different identities is the identity privacy requirement as defined in [19].

During the authentication procedure the sensor S captures a fresh biometric sample b'_i from user U_i and forwards the sample to AS. The authentication server AS manages authorizations and controls access to the service. To make the authorization decision, AS will rely on the result of the biometric verification or identification procedure that is carried out by the matcher M. It is assumed that there is no direct link between M and DB. As such, AS requests from DB the reference data that are needed by M and forwards them to M. It is further assumed that the system accepts only biometric credentials. This means that the user provides his biometric data and possibly his identity, but no user-specific key, password or token. Fig. 1 shows the participating entities.

Functional Requirements: Enrollment often involves off-line procedures, like identity checks, and is typically carried out under supervision of a security officer. Therefore, we assume that users are enrolled properly and only authentication procedures are analyzed in our framework. A distinction has to be made between verification and identification. Verification introduces a selection step, which implies that DB returns only one of its references, namely the b_i that corresponds to the identifier i that is used in the context of the database. The entity that does the mapping between ID_i and i, when applicable, is generally AS. In identification mode, DB returns the entire set of references, in some protected form, to AS. The database can then be combined with b'_i and forwarded to M. The matcher M has to verify that b'_i matches with one or a limited number of b_i in the received set of references, or that one of the matching references has index i.

We define the minimal logical functionality to be provided by our system entities in terms of generic information flows, which are included in our model in Fig. 1. In this functional model, we represent the result of the biometric comparison as a function of the distance d(b'_i, b_i). This is a generic representation of the actual comparison method: M can evaluate simple distances but can also run more complex comparisons, and will output either similarity measures or decisions that are based on some threshold t. The information flows are as follows. User U_i presents a biometric characteristic B_i that will be sampled by the sensor S to produce a sample b'_i. When operating in verification mode U_i will claim an identity ID_i:

    U_i --(b'_i ← B_i)--> S    or    U_i --(b'_i ← B_i, ID_i)--> S.    (1)

The sensor S forwards b'_i and ID_i in some form to AS:

    S --(f_1(b'_i))--> AS    or    S --(f_1(b'_i), g_1(ID_i))--> AS.    (2)

In general g_1(ID_i) = ID_i, but it can also be a mapping to an encrypted value to hide ID_i from AS. If applicable, AS resolves the mapping g_1(ID_i) to the identifier i and requests reference data for one or more users from DB by sending at least one request g_2(b'_i, i):

    AS --(g_2(b'_i, i))--> DB.    (3)

Note that the function g_2 does not necessarily use all the information in its arguments, e.g., the fresh sample b'_i may be ignored.

Database DB provides AS with reference data for one or more users in some form. It is possible that DB returns the entire database, e.g., in case of identification:

    AS <--(f_2({b_i}))-- DB.    (4)

The authentication server AS forwards the fresh sample b'_i and the reference data b_i in some combined form to M:

    AS --(f_3(b'_i, {b_i}))--> M.    (5)

Note that AS has only f_1(b'_i) and f_2(b_i) at his disposal to compute f_3(b'_i, {b_i}).

The matcher M performs a biometric comparison procedure on the received b'_i and {b_i} and returns the result to AS. The result may contain decisions or scores or different identities, but should at least be based on one distance calculation between the fresh sample b'_i and a reference b_i:

    AS <--(f_4(d(b'_i, {b_i})))-- M.    (6)

Different data are stored by the different entities. The database stores references {b_i}. The authentication service stores the information needed to map g_1(ID_i) to i, if applicable. The matchers can store non-biometric verification data, e.g., hashes of keys extracted from biometrics, or decryption keys that are used to recover the result of combining sample and reference. Also, the sensor can store key material to encrypt the fresh sample.

B. Adversary Model

Attacker Classification: Based on the physical entry point of an attack, a distinction is made between two types of attackers: internal attackers are corrupted components in the system and external attackers are entities that only have access to a communication channel. We will consider here only the issue of an insider attacker. As a baseline, we make the following assumption.
[Fig. 1: diagram of the system model showing the entities U_i, S, AS, DB and M, the flows f_1, g_1, g_2, f_2, f_3 and f_4, and the attack points A1-A4.]

Fig. 1. System model with indication of generic information flows and attack points A_i. User U_i's biometric is sampled by sensor S. The sample b'_i and U_i's identity are forwarded to the authentication server AS, who requests the corresponding reference b_i from database DB. AS combines the sample and the reference and forwards the result to matcher M, who performs the actual comparison and returns the result to AS. The solid arrows represent the messages exchanged between the system entities. The dashed arrow represents the implicit feedback on the authentication result to the user U_i, i.e., access to the requested service is granted if the sample matches the reference.
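As a concrete, purely illustrative instantiation of the flows in Fig. 1, the following sketch wires the entities together with placeholder functions: g_1 and g_2 collapse to plain table lookups, f_1 through f_3 to identity maps, and f_4 to a bare threshold decision. All names and values here are invented for illustration; a real scheme replaces these placeholders with cryptographic primitives.

```python
# Toy, insecure instantiation of flows (1)-(6): f1..f3 are identity maps,
# g1/g2 are table lookups and f4 is a bare accept/reject decision.
THRESHOLD = 2                                  # system threshold t

db = {7: [0, 1, 1, 0, 1, 0, 1, 0]}             # DB: references {b_i} by index i
id_map = {"ID_alice": 7}                       # AS: resolves g1(ID_i) to i

def hamming(u, v):                             # distance d used by the matcher
    return sum(x != y for x, y in zip(u, v))

def authenticate(claimed_id, fresh_sample):
    i = id_map[claimed_id]                     # flow (2): AS resolves g1(ID_i)
    reference = db[i]                          # flows (3)-(4): g2 request, f2 reply
    # flow (5): AS combines sample and reference (f3) and hands them to M;
    # flow (6): M returns f4(d(b'_i, b_i)), here a plain boolean decision.
    return hamming(fresh_sample, reference) <= THRESHOLD

b_fresh = list(db[7]); b_fresh[0] ^= 1         # noisy but related sample
assert authenticate("ID_alice", b_fresh)       # within distance t: accepted
```

The separation of duties is only simulated here; in the protocols analyzed below, each lookup and combination step is performed by a distinct, possibly adversarial entity.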
Assumption 1: The protocol ensures the security of the scheme against any external attacker.

As this can be achieved by classical secure-channel techniques, i.e., by an external security layer independent of the core protocol specification, we study further only the internal layer. Note that the security of the scheme needs to be expressed in terms of specific attack goals, which will be defined in the next section.

A second distinction is made based on an attacker's capabilities. Passive or honest-but-curious attackers are attackers that only eavesdrop on the communications in which they are involved and that can only observe the data that pass through them. They always follow the protocol specifications, never change messages and never generate additional communication. Active or malicious attackers are internal components that can also modify existing or real transactions passing through them and that can generate additional messages. We mainly focus on malicious internal attackers and we formulate the following additional assumption.

Assumption 2: The protocol ensures the security of the scheme against honest-but-curious entities, i.e., internal system components that always follow the protocol specifications but eavesdrop on internal communication.

We will explain in Section II-C how this has a direct impact on the properties of the different functionalities of our model.

Finally, we put aside the threats on the user or client side by concentrating the analysis on the remote server's side, i.e., AS, DB or M. The information leakage for the user and the client is generally only the authentication or identification result. They can, however, try to gain knowledge of the reference data b_i by running queries with different b'_i, e.g., in some kind of hill-climbing attack. The difficulty can vary highly depending on the modalities, the threshold and the scenario. A basic line of defense is to limit the number of requests, to ensure the aliveness of the biometric inputs provided by the user and to hide the result when applicable. Although it is important to implement such defense mechanisms, these threats are inherent to any biometric authentication or identification system. So we do not take the user or the sensor into account as an attacker in this model, and the primary attack points are AS, DB and M. Nonetheless, there may be inside attackers that also control the biometric inputs to some extent. We model this with a secondary attack point at the sensor.

Assumption 3: The user U_i or the sensor S cannot be attackers on their own, but they can act as a secondary attack point in combination with a primary attack point at AS, DB or M. If this is the case, an attacker can choose the input sample b'_i through S and observe through U_i whether the authentication request was successful.

Of course, the baseline assumptions have to be checked before proceeding with a full analysis of the security of a scheme, but as such they clarify what big issues may remain in state-of-the-art schemes. They also underline what the hardest challenges are when designing a secure biometric authentication protocol. Fig. 1 sums up the different attack points we consider in our attack model.

Attack Goals: As noted above, the security of a scheme is expressed in terms of specific attack goals or adversary objectives. Therefore, we define the following global attack goals.

• Learn reference b_i. In accordance with the security definitions in [25] we define different gradations in the information that an attacker may want to learn from b_i. Minimum leakage refers to the minimum information that allows, e.g., linking of references with high probability. Authorization leakage is the information that is needed to construct a sample that is within distance t, the system threshold, of the reference b_i. Full leakage gives full knowledge of b_i. When a scheme is resistant to this attack in all three gradations we say that it provides biometric reference privacy.

• Learn sample b'_i. The same gradations apply as in the previous attack goal. We call the security property associated with this attack goal biometric sample privacy.

• Trace users with different identities. This attack can be achieved when different references from the same user, possibly coming from different applications, can be
linked. A system that is resistant to such an attack is said to provide identity privacy [26].

• Trace users over different queries. This attack refers to linking queries, whether anonymized or not, based on i, b_i or b'_i. The property of a system that prevents such an attack is called transaction anonymity [26]. Note that an attacker that is able to learn b'_i can automatically trace users based on the learned sample.

The formulated attack goals may apply to the different internal attackers as indicated by the different attack points. The relevance of the attack goals is shown in TABLE I. Attack goals can be generalized for combinations of inside attackers, e.g., AS and M. They are relevant for the combination if they are relevant for each attacker individually. As a counterexample, learning b_i is not always relevant for the combination AS-DB. In some schemes it is assumed that DB stores references in the clear, so the attack "learn b_i" becomes trivial. It is important, however, that such schemes explicitly mention the assumption that DB is fully trusted. It will become clear in the further sections that the main focus of our work is on AS, who is a powerful attacker. This way of thinking is rather new and many protocols are not designed to be resistant to such an attacker.

TABLE I
RELEVANCE OF ATTACK GOALS FOR DIFFERENT (MALICIOUS) ENTITIES IN THE SYSTEM MODEL (? = only relevant if the scheme under consideration was designed to hide references from DB; * = only relevant if the protocol operates in identification mode or if ID_i and i are hidden from AS in verification mode).

Attack goal                              AS   DB   M
Learn b_i                                V    ?    V
Learn b'_i                               V    ?    V
Trace U_i with different identities      V    V    V
Trace U_i over different queries         V*   V    V

For each attacker or combination of attackers, and for each relevant attack goal, a security requirement can be defined, namely that the average success probability of the given attacker that mounts the given attack on the scheme should be negligible in terms of some security parameter defined by the application. When analyzing the security of biometric authentication protocols that include distributed entities, each of these requirements should be checked individually.

C. Requirements on Data Flows

Coming back to the functionalities in our system model (cf. Section II-A), we use the attack goals defined in TABLE I to impose requirements on the data that are being exchanged.

• AS should not be able to learn b'_i, hence f_1 is at least one-way, meaning that b'_i should be unrecoverable from f_1(b'_i) with overwhelming probability. To prevent tracing U_i over different queries, e.g., in identification mode, it could also be required that f_1 is semantically secure. We note that semantic security is a security notion that might be too strong, but it ensures that the function prevents the minimum leakage as described under attack goal learn b_i (Section II-B).

• AS should not learn b_i, hence f_2 is at least one-way. To prevent tracing users with different identities it may be required that f_2 is also semantically secure.

• If applicable, AS should not be able to trace U_i by linking queries on ID_i or i, and thus g_1 should be semantically secure.

• If applicable, DB may not learn b_i, hence these would need to be stored in a protected form, using some semantically secure function.

• DB may not learn b'_i, hence g_2 is one-way on its first input. It should also be semantically secure to prevent tracing U_i.

• DB may not be able to link the queries at all, hence g_2 should also be semantically secure on its second input.

• M may not learn the individual b_i or b'_i and must not be able to link references or queries from the same U_i, hence f_3 should be semantically secure on tuples ⟨b'_i, b_j⟩.

Now, as we demand that M returns a result to AS that is a function (f_4) of the distance d(b'_i, b_i) while maintaining the confidentiality and the privacy of the data, some operations must be malleable. Malleability refers to the property of some cryptosystems that an attacker can modify a ciphertext into another valid ciphertext that is the encryption of some function of the original message, but without the attacker knowing this message. Depending on the exact step in which the combination of b_i and b'_i is realized, either g_2, f_2 or f_3 would be malleable. In the following section, we will show the impact of this fundamental limitation and how it can be exploited to attack existing protocols.

III. APPLICATION TO EXISTING CONSTRUCTIONS

In this section, we begin to extend attacks that were introduced by Bringer et al. in [25] in the context of hardware security to more complex cryptographic protocols that use homomorphic encryption: in Section III-A for a scheme by Bringer et al. [19] and in Section III-B for a scheme by Barbosa et al. [20]. We then describe another kind of attack by looking at a scheme by Stoianov [22] in Section III-C. Finally, we briefly discuss attacks on two other schemes [23], [24] in Section III-D. All schemes are described with the goal to fit them directly into our model.

A. Bringer et al. ACISP 2007

1) Description: In [19], Bringer et al. presented a new security model for biometric authentication protocols that separates the tasks of comparing, storing and authorizing an authentication request amongst different entities: a fully trusted sensor S, an authentication server AS, a database DB and a matching service M. The goal was to prevent any of the latter three from learning the relation between some identity and the biometric features that relate to it. Their model forms the basis of our current framework, and in this model they presented a scheme that applies the Goldwasser-Micali cryptosystem [28].

Let E_GM and D_GM denote encryption and decryption, respectively, and note that for any m, m' ∈ {0,1} we have the homomorphic property D_GM(E_GM(m, pk) × E_GM(m', pk), sk) = m ⊕ m'. The scheme in [19] goes as follows.
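As a sanity check of the XOR homomorphism just stated, here is a toy Goldwasser-Micali implementation; the primes and all parameters are tiny and purely illustrative, so this is a sketch of the algebra rather than a secure implementation.

```python
import math
import random

# Toy Goldwasser-Micali: tiny primes for illustration only, NOT secure.
p, q = 499, 547                     # both primes are = 3 (mod 4)
n = p * q

def is_qr(a, prime):                # Euler's criterion: is a a square mod prime?
    return pow(a, (prime - 1) // 2, prime) == 1

# y: a quadratic non-residue modulo both p and q (Jacobi symbol +1 mod n)
y = next(a for a in range(2, n) if not is_qr(a % p, p) and not is_qr(a % q, q))

def enc(bit):                       # E_GM(bit, pk) with pk = (n, y)
    while True:
        r = random.randrange(2, n)
        if math.gcd(r, n) == 1:
            return (pow(y, bit, n) * pow(r, 2, n)) % n

def dec(c):                         # D_GM(c, sk) with sk = (p, q)
    return 0 if is_qr(c % p, p) else 1

# Homomorphic property: multiplying ciphertexts XORs the plaintext bits.
for m1 in (0, 1):
    for m2 in (0, 1):
        assert dec((enc(m1) * enc(m2)) % n) == m1 ^ m2
```

It is exactly this product-to-XOR behavior that lets AS combine E_GM(b'_i, pk) and E_GM(b_i, pk) into E_GM(b'_i ⊕ b_i, pk) without the secret key, and, as shown below, the same malleability is what the attacks exploit.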
During the enrollment phase, the user U_i registers at the authentication server AS. He then gets an index i and a pseudonym ID_i. Let N denote the total number of records in the system. Database DB receives and stores (b_i, i), where b_i stands for U_i's biometric template, a binary vector of dimension M, i.e., b_i = (b_{i,1}, b_{i,2}, ..., b_{i,M}). In the following, we suppose that i is also the index of the record b_i in the database DB.

A key pair is generated for the system. Matcher M possesses the secret key sk. The public key pk is known by S, AS and DB. The authentication server AS stores a table of relations (ID_i, i) for i ∈ {1,...,N}. Database DB contains the enrolled biometric data b_1, ..., b_N.

When user U_i wants to authenticate himself, S will send an encrypted sample E_GM(b'_i, pk) and ID_i to AS. The authentication server AS will request the encrypted reference E_GM(b_i, pk) from DB and combine it with the encrypted sample. Because of the homomorphic property, AS is able to obtain E_GM(b'_i ⊕ b_i, pk). Note that the encryption is bitwise, so AS will permute the M encryptions and forward these to M. Because M has the secret key sk, M can decrypt the permuted XOR-ed bits and compute the Hamming distance between the sample and the reference.

The security of this protocol is proved in [19] under the assumption that the entities in the system do not collude and are honest-but-curious. It is this assumption that we challenge in our framework, which leads to the following attack.

2) Authentication Server Adversary (A=AS): The following attack shows how a malicious authentication server AS can learn the enrolled biometric template b_i corresponding to some identity ID_i. To do so, the authentication server AS requests the template b_i without revealing ID_i and receives from DB the encrypted template that was stored during enrolment, i.e., E_GM(b_i, pk) = ⟨E_GM(b_{i,1}, pk), ..., E_GM(b_{i,M}, pk)⟩.

The attack consists of a bitwise search performed by AS in the encrypted domain. First AS computes the encryption of a zero bit, E_GM(0, pk). If the public key is not known by AS, he can take an encrypted bit of the template retrieved from DB and compute E_GM(b_{i,k}, pk)² = E_GM(0, pk). Let the maximum allowed Hamming distance be t.

Now AS will take the first encrypted bit E_GM(b_{i,1}, pk), repeat it t+1 times and add M−t−1 encryptions of a zero bit. Note that the ciphertext E_GM(b_{i,1}, pk) can be re-randomized so that it is impossible to detect that the duplicate ciphertexts are "copies". If b_{i,1} is one, the total Hamming distance as computed by M will be t+1 and M will return NOK (not ok). If b_{i,1} is zero, M will return OK. This process can be repeated for all bits of b_i; hence, AS can learn b_i bit by bit in M queries. To further disguise the attack, AS can apply permutations and add up to t encryptions of one-bits to make the query look genuine.

3) Matcher and Sensor Adversary (A=M+S): A bitwise search attack similar to the previous attack can also be considered in the case of an adversary made of the matcher assisted by the sensor. The attack consists of the following steps:

• S sends the encryption of 0̄ = ⟨0,...,0⟩;
• M receives b_i ⊕ 0̄ bitwise but permuted, and records the weight of b_i ⊕ 0̄;
• S toggles a bit of the 0̄ vector in position x and sends it to AS;
• M observes the changed weight (+1 or −1) and learns the bit at position x in b_i.

The adversary learns b_i in M queries.

4) Discussion: What makes the first attack (A=AS) feasible is that all bits are encrypted separately and that the cryptosystem is homomorphic, and thus f_1(b'_i) and f_2(b_i) are malleable (needed to create the encryption of a zero-bit if the public key is not known). Moreover, it is not enforced that AS combines the input from the sensor and from the database. To counteract this threat, one could require S to sign the input and force DB to merge the input with references; in this way DB combines the sample and the reference, and AS does not receive the reference E_GM(b_i, pk) but the combination of the sample E_GM(b'_i ⊕ b_i, pk). Using the previous attack, however, AS can still learn b'_i and the b'_i ⊕ b_i. Additional measures have to be taken to prevent this, e.g., DB could be required to sign E_GM(b'_i ⊕ b_i, pk), which will be verified by M. Note that in the case where AS and DB collude, these countermeasures are not sufficient anymore.

B. Barbosa et al. ACISP 2008

1) Description: In [20] Barbosa et al. presented a new protocol for biometric authentication, following [19] (see the previous Section III-A). A notable difference between these two comes from the fact that [19] compares two biometric templates by their Hamming distance, enabling biometric authentication, whereas [20] classifies one biometric template into different classes thanks to an SVM classifier (support vector machine, see [29] for details), leading to biometric identification. Biometric templates are represented as feature vectors where each feature is an integer, i.e., b_i = ⟨b_{i,1}, ..., b_{i,k}⟩ ∈ N^k. Barbosa et al. encrypt this vector, feature by feature, with the Paillier cryptosystem [30]. In particular, they exploit its homomorphism property to compute the SVM classifier (think of a sum of scalar products) in the encrypted domain.

However, as we explain further below in this section, as the features are encrypted one by one, an adversary can do something similar to the attack described in the previous section (Section III-A).

Let E_Paillier (resp. D_Paillier) denote encryption (resp. decryption) with Paillier's cryptosystem. This cryptosystem enjoys a homomorphic property which ensures that the product of two ciphertexts corresponds to the encryption of their sum: for m_1, m_2 ∈ Z_n we have that D_Paillier(E_Paillier(m_1) × E_Paillier(m_2)) = m_1 + m_2 mod n. Note that Z_n is the plaintext space of the Paillier cryptosystem.

The SVM classifier takes as input U classes (or users) and S samples per class, and determines support vectors SV_{i,j} and weights α_{i,j} for 1 ≤ i ≤ S and 1 ≤ j ≤ U. Following the notation in [20], let v = (v_1, ..., v_k) = b_i denote a freshly captured biometric sample.
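The additive property of Paillier described above can be checked with a toy implementation; the primes are tiny and purely illustrative, so this is a sketch of the algebra, not a secure implementation. The exponentiation rule in the last line is the mechanism that allows the classifier sums of [20] to be evaluated in the encrypted domain.

```python
import math
import random

# Toy Paillier cryptosystem with tiny parameters; illustration only, NOT secure.
p, q = 293, 433
n = p * q
n2 = n * n
lam = math.lcm(p - 1, q - 1)        # lambda(n), the private key
g = n + 1                           # standard simple choice of generator

def L(u):                           # the L-function from Paillier's scheme
    return (u - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n) # precomputed modular inverse mod n

def enc(m):                         # E_Paillier(m)
    while True:
        r = random.randrange(2, n)
        if math.gcd(r, n) == 1:
            return (pow(g, m, n2) * pow(r, n, n2)) % n2

def dec(c):                         # D_Paillier(c)
    return (L(pow(c, lam, n2)) * mu) % n

m1, m2 = 1234, 56789
# Product of ciphertexts decrypts to the sum of the plaintexts:
assert dec((enc(m1) * enc(m2)) % n2) == (m1 + m2) % n
# Exponentiation scales the plaintext: E(m)^k decrypts to k*m mod n.
assert dec(pow(enc(m1), 3, n2)) == (3 * m1) % n
```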
For this sample the classifier computes

    cl_SVM^(j)(v) = Σ_{i=1}^{S} α_{i,j} Σ_{l=1}^{k} v_l (SV_{i,j})_l    for j = 1,...,U.    (7)

With this vector cl_SVM(v), it is possible to determine which class is the most likely for v, or to reject it. The support vectors SV_{i,j} and the weight coefficients α_{i,j} are the references that are stored by DB.

Briefly, the scheme of Barbosa et al. works as follows:

1) The sensor S captures a fresh biometric sample, encrypts each of the features of its template v = (v_1,...,v_k) with Paillier's cryptosystem and sends it to the authentication server AS. Let auth = (E_Paillier(v_1), ..., E_Paillier(v_k)).
2) The database DB computes an encrypted version of the SVM classifier for this biometric data: c_j = Π_{i=1}^{S} (Π_{l=1}^{k} [auth]_l^{(SV_{i,j})_l})^{α_{i,j}}, where [.]_l denotes the l-th component of a tuple. This c_j corresponds to the encryption of cl_SVM^(j) with Paillier's cryptosystem as defined above. The database returns the values c_j to AS.
3) The authentication server AS scrambles the values c_j and forwards them to M. (In [20], the entity that makes the decision is referred to as the verification server; to be consistent with our model we continue to use the term matcher.)
4) The matcher M, using the private key of the system, decrypts the components of the SVM classifier and performs the classification of v. The classification returns the class for which the value cl_SVM^(j) is maximal.
5) Based on the output of M, AS determines the real identity of U_i (in case of non-rejection).

2) Authentication Server Adversary (A=AS): The following attack shows how a malicious AS can recover the biometric references. In this scheme, the biometric reference data that are stored by DB, i.e., the support vectors SV_{i,j} and the weight coefficients α_{i,j}, represent hyperplanes that are used for classification. These k-dimensional hyperplanes are expressed as linear combinations of enrolment samples (the support vectors). We will show how these hyperplanes can be recovered dimension by dimension.

Let us rewrite (7) as

    cl_SVM^(j)(v) = v_1 Σ_{i=1}^{S} α_{i,j} (SV_{i,j})_1 + ... + v_k Σ_{i=1}^{S} α_{i,j} (SV_{i,j})_k
                  = v_1 β_{j,1} + ... + v_k β_{j,k}.

By sending a vector v = ⟨1,0,...,0⟩ to DB, AS will retrieve the encryption of β_{j,1} = Σ_{i=1}^{S} α_{i,j} (SV_{i,j})_1 for each user, indexed by j, in the database.

Instead of sending all c_j = E_Paillier(β_{j,1}) to M, only one value will be kept by AS, e.g., c_1 = E_Paillier(β_{1,1}). The authentication server will set c_2 = E_Paillier(x) for some value x ∈ Z_n and all other c_j = E_Paillier(0). The matcher M will return the index of the class with the greatest value, which is 1 if β_{1,1} > x and 2 if β_{1,1} < x.

The initial value of x is n/2. If β_{1,1} > x then AS will adjust x to n/2 + n/4, otherwise x = n/2 − n/4. By repeating this process and adjusting the value x, the exact value β_{1,1} can be learned after log_2 n queries. Hence, the reference data of a single user can be learned in k·log_2 n queries to the matcher (with the permutation). This is quite logical, as the matcher is determining a list of candidates. In particular, although the identifiers are permuted, he can detect whether related inputs are used, and thus trace a user over the whole database (with a known input).

3) Discussion: As in Section III-A this attack succeeds because features are encrypted separately and there is no check to see whether the sample and the reference were really merged. The same attack can in principle be used to learn any information about the input sample.

C. Stoianov SPIE 2010

1) Description: In [22], Stoianov introduces several authentication schemes relying on the Blum-Blum-Shub pseudo-random generator. We focus on the database setting from the paper (cf. Section 7 of [22]). In this setting there is a service provider SP that performs the verification. Consistent with our model, we will call this entity the matcher M. Sample and reference are combined before being sent to M, and although this is not explicitly mentioned in [22] we assign this functionality to the authentication server AS in our model.

In the schemes of [22], the biometric data b are binarized and are combined with a random codeword c coming from an error-correcting code to form a secure sketch or code offset b ⊕ c, where ⊕ stands for the exclusive-or (XOR). When a new capture b' is made, whenever b' is close to b (using the Hamming distance) it is possible to recover c from b ⊕ b' ⊕ c using error correction. This technique is known as the fuzzy commitment scheme of Juels and Wattenberg [5]. An additional layer of protection is added by encrypting the secure sketch using Blum-Goldwasser.

The Blum-Blum-Shub pseudo-random generator [31] is a tool used in the Blum-Goldwasser asymmetric cryptosystem [32]. From a seed x_0 and a public key, a pseudo-random sequence S is generated. In the following, S is XOR-ed with the biometric data to be encrypted. By doing so, the state of the pseudo-random generator is updated to x_{t+1}. From x_{t+1} and the private key, the sequence S can be recomputed.

In this system of Stoianov, M generates the keys and sends the public key to S. On enrollment:

1) Sensor S computes (S ⊕ b ⊕ c, x_{t+1}) where:
   • sample b is the freshly captured biometric data,
   • string S is a pseudo-random sequence and x_{t+1} is the state of the Blum-Blum-Shub pseudo-random generator as described above, and
   • c is a random codeword which makes the secure sketch c ⊕ b;
2) Sensor S sends S ⊕ b ⊕ c to DB;
3) Sensor S sends x_{t+1} and H(c) to M, where H is a cryptographic hash function.

Using the private key, M computes S from x_{t+1} and stores it along with H(c). Periodically, M (resp. DB) updates S (resp. S ⊕ b ⊕ c) to Š (resp. Š ⊕ c ⊕ b) with an independent stream cipher.

During authentication, sensor S receives a new sample b' and forwards (S' ⊕ b', x'_{t+1}) to AS, where S' is a new pseudo-random sequence.
7
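The fuzzy commitment layer used above can be sketched as follows. This toy example (ours, not the paper's) replaces the scheme's full error-correcting code with a 3-fold repetition code and omits the Blum-Blum-Shub masking; all names and parameters are illustrative:

```python
import hashlib
import secrets

R = 3  # repetition factor of the toy error-correcting code

def encode(msg_bits):
    """Repetition-code encoder: repeat each message bit R times."""
    return [bit for bit in msg_bits for _ in range(R)]

def decode(code_bits):
    """Majority-vote decoder: corrects up to (R - 1) // 2 errors per block."""
    return [int(sum(code_bits[i:i + R]) > R // 2)
            for i in range(0, len(code_bits), R)]

def xor(u, v):
    return [a ^ b for a, b in zip(u, v)]

# Enrollment: commit to the biometric b with a random codeword c.
msg = [secrets.randbits(1) for _ in range(8)]
c = encode(msg)                                   # random codeword c
b = [secrets.randbits(1) for _ in range(len(c))]  # enrolled biometric sample
sketch = xor(b, c)                                # helper data b XOR c, stored by DB
h = hashlib.sha256(bytes(c)).hexdigest()          # H(c), kept by the matcher

# Verification: a fresh capture b2 close to b (a single noisy bit).
b2 = b[:]
b2[0] ^= 1
recovered_c = encode(decode(xor(b2, sketch)))     # decode b2 XOR b XOR c
assert hashlib.sha256(bytes(recovered_c)).hexdigest() == h  # match accepted
```

When b2 drifts too far from b the decoder returns a different codeword and the hash check fails; this accept/reject behavior is exactly what the attacks in this section exploit.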
The authentication server AS retrieves Sˇ ⊕ c ⊕ b from DB and merges it with S′ ⊕ b′. Finally, S′ ⊕ b′ ⊕ Sˇ ⊕ b ⊕ c and x′_{t+1} are sent to M. Using the private key, M recovers S′. From S′ and Sˇ, M computes c ⊕ b ⊕ b′, tries to decode it and verifies the consistency of the result with H(c).

2) Matcher Adversary (A=M): Let M be the primary attacker. It is inherent to the scheme that M can always trace a valid user over different queries by looking at the codeword c, which is revealed after a successful authentication. Depending on the entity that colludes with M, additional attacks can be devised.

If M and DB collude (A=M+DB) they learn the sketch c ⊕ b. This implies that they can immediately trace users with different identities following the linkability attack based on the decoding of the sum of two sketches as described in [11]. From a genuine match, M learns c and thus also b.

If M and S collude (A=M+S) they control and always learn the input sample b′. By setting b′ = 0 they learn c ⊕ b from a single query. If a successful authentication occurred, the adversary learns everything.

If M and AS collude (A=M+AS) they always learn the input sample b′. They can learn the sketch c ⊕ b for any reference and thus trace users with different identities as in the case (A=M+DB). They learn the reference b after successful authentication (via an exhaustive search, block by block, in case of an accept, to reconstruct b ⊕ b′).

3) Authentication Server Adversary (A=AS): In the current scheme, bits are not encrypted bit per bit independently. Moreover, they are masked with streams generated via Blum-Blum-Shub and a codeword, so attacks as in Sections III-A and III-B are no longer possible. Nevertheless, there is still a binary structure that AS may exploit.

Assume that AS knows S′ ⊕ b′ that leads to a positive decision, i.e., M accepts b′ because d(b, b′) ≤ t. Then AS can start from S′ ⊕ b′ and progressively add errors until he reaches a negative result. Then he backtracks one step by decreasing the error weight by one to come back to the last positive result. This gives AS an encrypted template S′ ⊕ b′′. Consider now the vector S′ ⊕ b′′ ⊕ Sˇ ⊕ c ⊕ b and replace the first bits (say of small length l) by an l-bit vector x.
• For all possible values of x, AS sends the resulting vector (the first block is changed by the value x) to M, who acts as a decision oracle.
• If several values give a positive result, then AS increases the errors on all but the first block.
• This is repeated until only one value of x gives a positive result.
• When this step is reached, AS has found the value x with no errors, i.e., he learns the first block of S′ ⊕ Sˇ ⊕ c.
• AS proceeds to the next block.

Following this strategy, it is feasible to recover all the bits of b ⊕ b′. If AS colludes with S, he can retrieve the full reference template b as soon as S knows one sample that is close to b. We call this attack a center search (cf. Section IV below).

4) Discussion: In a way similar to the inherent traceability of users by M, there are no mechanisms described that protect against the database tracing users over different queries, i.e., by tracking Sˇ ⊕ c ⊕ b lookups.

We note that the matcher M is very powerful because he knows the secret key, which allows computing S′ and Sˇ. As soon as M colludes with one of the other entities he is able to learn everything from a genuine match or a false accept.

D. Other Schemes

Due to the generic design of our model, several other schemes in the literature fit our model. Nevertheless, as they are not always designed with the same entities, an adaptation might be required. Some others are not compatible at all; for instance, those for which the security relies on a user-secret key stored on the user side. We now present a brief overview of the schemes [23], [24] when analyzed in our model.

ACM MMSec 2010 eSketch: This scheme of Failla et al. is described in [23] following a client-server model. The client corresponds to AS and the server can be logically separated into DB and M. The goal of the scheme is to provide anonymous identification. The DB stores data derived from the biometric reference, in particular secure sketches, and part of the data is encrypted via the Paillier cryptosystem with the corresponding secret key owned by AS. The identification query is implemented through different exchanges between the entities, and at one step the same randomness is used to mask all the different reference templates and the masked values are sent to AS. Consequently, an authentication server adversary (A=AS) learns the whole database after one successful authentication, because the client (AS in our model) knows the Paillier secret keys. If the adversary consists of the database and the matcher (A=DB+M), it is also possible to learn the reference template, which is supposed to be hidden from the server.

ACM MMSec 2010 Secure Multiparty Computation: This scheme of Raimondo et al. [24] is also described following a client-server model, with secure multiparty computation between them to achieve an identification scenario (an authentication scenario as well, cf. [24, Fig. 3]). The scheme is not made to be resistant against malicious adversaries. Fitting it in our model, we have AS, which obtains the result, and DB, which stores all the references in clear; AS sends an encrypted (via Paillier) query to DB; DB sends back to AS all the entries combined with the query (this gives in fact a database containing all the Euclidean distances); and then AS and M interact (secure multiparty protocol) to output the list of identifiers for which the distance is below a threshold. Here again encryption of the query is made block by block, so a similar strategy as in Section III-B is possible when A=AS. An adversary A=DB+M can also tamper with the inputs to the last part of the protocol to learn information about the query.

IV. FORMALIZATION OF ATTACK SCENARIOS

The goal of this section is to explore some generic attack scenarios that can be used for analyzing actual protocol specifications. These attacks are presented in the framework as described in Section II and generalize the attacks of the previous Section III. As explained in Section II, only malicious internal attackers are considered.
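Before formalizing these scenarios, the block-by-block decision-oracle strategy of Section III-C can be made concrete in a small sketch. This is our illustration, not the paper's protocol: the matcher is reduced to a Hamming-distance decision oracle, the parameters are toy values, and the attacker is assumed to already hold an error-free accepted query (as produced by the add-errors-then-backtrack step described above):

```python
import itertools
import secrets

n, l, t = 16, 4, 3  # toy values: string length n, block size l, threshold t (t <= n - l)

target = [secrets.randbits(1) for _ in range(n)]  # plays the role of S' XOR S-check XOR c

def oracle(query):
    """Matcher as a decision oracle: accept iff Hamming distance <= t."""
    return sum(q ^ s for q, s in zip(query, target)) <= t

# q0 is an accepted query with no residual errors; the attacker holds this
# value without knowing `target` (assumption standing in for the backtrack step).
q0 = target[:]

recovered = []
for k in range(0, n, l):
    # Positions outside the current block, used to inject exactly t errors.
    outside = [i for i in range(n) if not k <= i < k + l]
    accepted = []
    for guess in itertools.product([0, 1], repeat=l):
        q = q0[:]
        q[k:k + l] = guess
        for i in outside[:t]:
            q[i] ^= 1              # push the rest of the string to distance t
        if oracle(q):              # accepted iff the block guess is error-free
            accepted.append(list(guess))
    assert len(accepted) == 1      # exactly one candidate survives per block
    recovered += accepted[0]

assert recovered == target
```

Because t errors are already placed outside the block, a query is accepted only when the guessed block contributes zero additional errors, which singles out the correct value of each block.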
The malicious internal attackers considered are AS, DB, M, and combinations of these entities. User U_i and the sensor S have been excluded as individual attackers.

A. Blackbox Attack Model

The different attacks that can be carried out by the attackers are modeled as blackbox attacks, following recent results from [25]. This allows us to clearly specify the focus of the attack. Our blackbox-attack model consists of two logical entities:
1) The attacker, i.e., one or more system entities. These entities are fully under control of the attacker: internal data are known, messages can be modified and additional transactions can be generated.
2) The target or the blackbox, i.e., the combination of all other system entities. The attack is focused on the data that are protected by the system components within the blackbox.

The target is modeled as a blackbox because the attacker can only observe the input-output behavior of the box. This adequately reflects remote protocols where only the communication can be seen by the attacker. No details are known about the internal state of the remote components. During the attack, the attacker will "tweak" inputs to the blackbox. However, all communication must comply with the protocol specification. Any messages that are malformed or that are sent in the wrong order are rejected by the blackbox.

It should be noted that there are cases in which the attacker cannot generate additional transactions because he has to follow the protocol specifications. E.g., if DB is attacking he has to wait until a request is received from AS. When analyzing protocols it should be assumed that this will occur with a reasonable frequency. If relevant, attack complexity can be expressed as a function of this frequency. Similarly, if the attacker is AS, he receives inputs from S and communicates with DB and M. In this case we exclude U_i and S from the blackbox. It should be assumed, though, that a number of inputs from S is available to AS. This does not necessarily imply that S is under control of AS. The analysis of the attack can take into account the amount of data that is available.

We will now consider a number of possible adversaries and blackbox attacks in our framework.

B. Attacker = AS

Decomposed Reference Attack: Let us assume that only one reference b_i is returned by DB. The goal of this attack is to learn b_i. Biometric samples or references are often represented as a "string", i.e., a concatenation (let ∥ denote concatenation) of (binary) symbols. Let us assume that f_2(b_i) is the concatenation of a subfunction fˆ_2 that is applied on each of the n components b_{i,j} of b_i individually. If AS has to combine f_2(b_i) and f_1(b′_i) without knowing either the sample or the reference, it is likely that f_1 and f_3 will also be the concatenation of component-wise applied subfunctions, i.e., f_3(b_i, b′_i) = fˆ_3(b_{i,1}, b′_{i,1}) ∥ ... ∥ fˆ_3(b_{i,n}, b′_{i,n}). Note that in our model AS can generate the value fˆ_3(b_{i,j}, b′_{i,j}), but this value should not reveal to AS whether the inputs are the same or not. This decomposition of references is used in the scheme analyzed in Section III-A and the following attack applies to it.

Suppose that AS is able to generate a value that is a valid output of fˆ_3 when the two component inputs b_{i,j} and b′_{i,j} are the same, and similarly when they are not the same, e.g., the output is the encryption of one or zero. If AS can also compute f_1, then AS can fully reconstruct b_i. To do so, AS chooses the first component of b′_i at random, combines it with the first component of b_i and sends the result to M. The other components that are sent to M are such that t of those are an output of fˆ_3 that reflects different inputs and the n−t−1 remaining components are outputs that reflect equal inputs. Note that t is the comparison threshold. If the guess of AS for the first component is correct then M will return a positive match. Otherwise the guess is wrong and AS can try again. This process can be repeated until all components of b_i are recovered. For binary samples, this requires n queries to M and 1 query to DB.

As shown in Section III-B, a similar attack can be executed if the biometric data are represented as real-valued or integer-valued feature vectors. However, more queries might be required to get an accurate result.

Center Search Attack Using S: In this attack, S is also compromised and under the control of the attacker. The attack goal is to learn the full reference b_i from a close sample. The input sample is obviously always known to AS and S. Thus at some point in time U_i will present a sample b′_i that matches reference b_i. This sample will lie at some distance from the reference. In the case where biometrics are represented as binary strings and the system implements a Hamming distance matcher, the attacker can recover the exact b_i as follows.

The sensor flips the first bit of b′_i and sends the new sample to AS, who performs the whole authentication procedure. If the authentication succeeds, S flips the second bit, leaving the first bit also flipped, and sends the sample to AS, who follows the procedure again. This continues until the sample no longer matches b_i. Then the sensor starts again by restoring the first bit of the sample that is no longer accepted and forwards it to AS. If it gets accepted, this means that the first bit of the original sample b′_i was the same as the first bit of b_i. If not, then the first bits were different. One by one, the bits in b′_i that are different from those in b_i can be corrected. This technique was demonstrated in Section III-C.

We call this the center search attack because we start from a sample that lies in a sphere with radius t, the matching threshold, and the reference as center point. The goal of this attack is to move the sample to the center of the sphere. The worst-case complexity of this attack for bitstrings of length n is the greatest of 2t+n and 4t. The complexity is 2t+n if there are t−1 bit-errors in the beginning and one at the end of the string. The first t−1 errors get corrected by flipping them and t additional bits need to be flipped to invalidate the sample. Locating the bit-errors requires searching till the end of the string where the last error is. The complexity is 4t if there are t−1 correct bits followed by t wrong bits. So 2t flips are needed before the queries no longer match.
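The center search described above can be sketched against a Hamming-distance accept oracle. This is our illustrative reconstruction, assuming t < n/2 so that flipping every bit always invalidates a sample; names and parameters are ours:

```python
import secrets

def center_search(sample, accept):
    """Recover the exact reference behind an accept/reject oracle
    (accept iff Hamming distance to the reference is <= t), starting
    from a sample that is already accepted."""
    n = len(sample)
    ref = sample[:]                    # current guess, corrected in place
    untested = list(range(n))
    while untested:
        s = ref[:]
        flipped = []
        # Flip untested bits first (then already-corrected bits as padding)
        # until the oracle rejects; s is then at distance exactly t + 1.
        for p in untested + [q for q in range(n) if q not in untested]:
            s[p] ^= 1
            flipped.append(p)
            if not accept(s):
                break
        # Restore each flipped untested bit: acceptance means the guessed
        # bit agreed with the reference, rejection means it differed.
        for p in [q for q in flipped if q in untested]:
            s[p] ^= 1                  # restore bit p (distance t or t + 2)
            if not accept(s):
                ref[p] ^= 1            # bit differed from the reference: fix it
            s[p] ^= 1                  # re-flip to return to distance t + 1
            untested.remove(p)
    return ref

# Toy run: reference of length 16, threshold t = 3, sample with 2 errors.
n, t = 16, 3
reference = [secrets.randbits(1) for _ in range(n)]
accept = lambda q: sum(a ^ b for a, b in zip(q, reference)) <= t
sample = reference[:]
sample[2] ^= 1
sample[9] ^= 1                         # d(sample, reference) = 2 <= t
assert center_search(sample, accept) == reference
```

Each outer round pushes the query to the decision boundary and then classifies every flipped bit with two further queries, which matches the flip-then-restore procedure described in the text.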
Then 2t positions need to be searched. In practice, t ≤ n/2 and thus the worst-case complexity is 2t+n.

False-Acceptance-Rate Attack: A false acceptance occurs if a sample, not coming from U_i, is close enough to b_i to be recognized by the system as a sample coming from U_i. The name comes from the fact that an attacker can take a large existing database of samples and feed that to a biometric authentication system. Due to the inherent false acceptance rates, there will be a sample in the attacker's database that matches the reference in the system with high probability.

The goal of this attack is to learn b_i from a matching sample that is unknown to the attacker. This attack combines ideas from the previous attacks. The attacker is AS, not including S, and AS does not know how to compute an output of f_3 that reflects equal (or different) inputs. It is assumed, however, that the attacker can replace the components of b′_i in the value he received from S, i.e., f_1(b′_i). This is definitely the case if f_1 is a concatenation of subfunctions and if AS can compute such a subfunction fˆ_1.

The actual attack then proceeds as follows. The attacker AS waits until a genuine user presents a valid sample. The attack is similar to the center-search attack, only now AS will not flip bits but simply replace them with a known value, e.g., one. He will do this until the sample no longer matches. Then AS already knows that the last bit he replaced was not one and he will restore that bit. Then he continues to substitute the bits one by one, carefully observing whether the sample matches or not and learning all the bits. The first bits that were flipped to invalidate the sample can be learned by simply restoring them.

C. Attacker = DB or M

The attacker is the database DB or the matcher M, who communicates with the authentication server AS only. The attackers cannot achieve any of the attack goals individually because their blackboxes give output, which cannot be influenced by the attacker, before receiving input. If these entities do not collude with other entities they are simply passive attackers and by Assumption 2 they cannot mount any attacks.

Including the sensor S: If the sensor colludes with the database or the matcher, some attacks are trivial: the provided sample is known and thus it is also easy to trace a user based on the provided sample or identity.

A powerful attacker is the combination of M and S, as was shown in Section III-C. Because the sensor can send any input and any identity, the attacker does not have to wait for a matching sample. The same center-search attack can be performed as in the case where AS and S are the attacker.

D. Attacker = AS and DB

The attackers (AS and DB) receive fresh input from S and communicate with the matcher. They can search the entire database and turn to identification, although the protocol could be designed to operate in verification mode.

The input sample b′_i can be learned in the same way as b_i was learned in the attacks of AS, if the same conditions hold. Then, depending on the implementation, the attacker can learn the entire database because DB will return any b_i and AS will manipulate it until all bits are known.

E. Attacker = AS and M

The attack goal with the highest impact is to learn the reference b_i from the database. Depending on how M implements its functionality, this can be a very powerful attacker, e.g., if M possesses decryption keys for encrypted samples/templates, as was the case in the schemes analyzed in Sections III-A, III-B and III-C.

F. Attacker = DB and M

In this combination of attackers, DB will manipulate its output so that it can be of use to M. The relevant attack goals are to learn b′_i and to trace U_i.

G. Attacker = AS and DB and M

In this particular case, the attacker is a combination of AS, DB and M, and the goal is to learn b′_i. If the reference b_i is not stored in the clear by the database, the attacker may want to learn b_i also. Tracing U_i is almost trivial because the attacker can perform a search (identification) on the database DB. The attack goals are easily reached if the data can be decomposed as explained in the attacks of AS.

V. CONCLUSION

Biometric authentication protocols that are found in the literature are usually designed in the honest-but-curious model, assuming that there are no malicious insider adversaries. In this paper, we have challenged that assumption and shown how some existing protocols are not secure against such adversaries. Such analysis is extremely relevant in the context of independent database providers. Much attention was given to an authentication server attacker, which is a central and powerful entity in our model. To prevent the attacks that were presented, stronger enforcement of the protocol design is needed: many attacks succeed because transactions can be duplicated or manipulated.

REFERENCES

[1] J.-P. M. G. Linnartz and P. Tuyls, "New shielding functions to enhance privacy and prevent misuse of biometric templates," in AVBPA, ser. LNCS, J. Kittler and M. S. Nixon, Eds., vol. 2688. Springer, 2003, pp. 393–402.
[2] I. Buhan, J. Doumen, P. H. Hartel, and R. N. J. Veldhuis, "Fuzzy extractors for continuous distributions," in ASIACCS, F. Bao and S. Miller, Eds. ACM, 2007, pp. 353–355.
[3] Y. Dodis, L. Reyzin, and A. Smith, "Fuzzy extractors: How to generate strong keys from biometrics and other noisy data," in Advances in Cryptology - EUROCRYPT 2004, ser. LNCS, C. Cachin and J. Camenisch, Eds., vol. 3027. Springer, 2004, pp. 523–540.
[4] G. Davida, Y. Frankel, and B. Matt, "On enabling secure applications through off-line biometric identification," Proc. of the IEEE Symp. on Security and Privacy – S&P '98, pp. 148–157, May 1998.
[5] A. Juels and M. Wattenberg, "A fuzzy commitment scheme," in CCS '99: Proc. of the 6th ACM Conf. on Computer and Communications Security. New York, NY, USA: ACM Press, 1999, pp. 28–36.
[6] A. Juels and M. Sudan, "A fuzzy vault scheme," in Proc. of IEEE Int. Symp. on Information Theory, Lausanne, Switzerland, A. Lapidoth and E. Teletar, Eds. IEEE Press, 2002, p. 408.
[7] N. K. Ratha, J. H. Connell, and R. M. Bolle, "Enhancing security and privacy in biometrics-based authentication systems," IBM Systems J., vol. 40, no. 3, pp. 614–634, 2001.
[8] N. Ratha, J. Connell, R. Bolle, and S. Chikkerur, "Cancelable biometrics: A case study in fingerprints," in Pattern Recognition, 2006. ICPR 2006. 18th Int. Conf. on, vol. 4, 2006, pp. 370–373.
[9] N. K. Ratha, S. Chikkerur, J. H. Connell, and R. M. Bolle, "Generating cancelable fingerprint templates," IEEE Trans. Pattern Anal. Mach. Intell., vol. 29, no. 4, pp. 561–572, 2007.
[10] X. Boyen, "Reusable cryptographic fuzzy extractors," in CCS '04: Proc. of the 11th ACM Conf. on Computer and Communications Security. New York, NY, USA: ACM, 2004, pp. 82–91.
[11] K. Simoens, P. Tuyls, and B. Preneel, "Privacy weaknesses in biometric sketches," in 2009 30th IEEE Symp. on Security and Privacy, May 2009, pp. 188–203.
[12] I. Buhan, J. Breebaart, J. Guajardo, K. de Groot, E. Kelkboom, and T. Akkermans, "A quantitative analysis of crossmatching resilience for a continuous-domain biometric encryption technique," in First Int. Workshop on Signal Processing in the EncryptEd Domain, SPEED 2009, 2009.
[13] A. Nagar and A. Jain, "On the security of non-invertible fingerprint template transforms," in Information Forensics and Security, 2009. WIFS 2009. First IEEE Int. Workshop on, 2009, pp. 81–85.
[14] L. Rila and C. J. Mitchell, "Security protocols for biometrics-based cardholder authentication in smartcards," in ACNS, ser. LNCS, J. Zhou, M. Yung, and Y. Han, Eds., vol. 2846. Springer, 2003, pp. 254–264.
[15] J. Bringer, H. Chabanne, T. A. M. Kevenaar, and B. Kindarji, "Extending match-on-card to local biometric identification," in Biometric ID Management and Multimodal Communication, BioID-Multicomm 2009, ser. LNCS, J. Fierrez, J. Ortega-Garcia, A. Esposito, A. Drygajlo, and M. Faundez-Zanuy, Eds., vol. 5707. Springer, 2009, pp. 178–186.
[16] P. Tuyls, B. Škorić, and T. Kevenaar, Eds., Security with Noisy Data: Private Biometrics, Secure Key Storage and Anti-Counterfeiting. Springer-Verlag London, 2007.
[17] B. Schoenmakers and P. Tuyls, Private Profile Matching, ser. Philips Research Book Series. Springer-Verlag, New York, 2006, vol. 7, pp. 259–272.
[18] J. Bringer, H. Chabanne, D. Pointcheval, and Q. Tang, "Extended private information retrieval and its application in biometrics authentications," in CANS, ser. LNCS, F. Bao, S. Ling, T. Okamoto, H. Wang, and C. Xing, Eds., vol. 4856. Springer, 2007, pp. 175–193.
[19] J. Bringer, H. Chabanne, M. Izabachène, D. Pointcheval, Q. Tang, and S. Zimmer, "An application of the Goldwasser-Micali cryptosystem to biometric authentication," in ACISP, ser. LNCS, J. Pieprzyk, H. Ghodosi, and E. Dawson, Eds., vol. 4586. Springer, 2007, pp. 96–106.
[20] M. Barbosa, T. Brouard, S. Cauchie, and S. M. de Sousa, "Secure biometric authentication with improved accuracy," in ACISP, ser. LNCS, Y. Mu, W. Susilo, and J. Seberry, Eds., vol. 5107. Springer, 2008, pp. 21–36.
[21] J. Bringer and H. Chabanne, "An authentication protocol with encrypted biometric data," in AFRICACRYPT, ser. LNCS, S. Vaudenay, Ed., vol. 5023. Springer, 2008, pp. 109–124.
[22] A. Stoianov, "Cryptographically secure biometric," in SPIE Biometric Technology for Human Identification VII, vol. 7667, 2010.
[23] P. Failla, Y. Sutcu, and M. Barni, "eSketch: a privacy-preserving fuzzy commitment scheme for authentication using encrypted biometrics," in Proc. of the 12th ACM workshop on Multimedia and security (MMSec '10). ACM, 2010, pp. 241–246.
[24] M. D. Raimondo, M. Barni, D. Catalano, R. D. Labati, P. Failla, T. Bianchi, D. Fiore, R. Lazzeretti, V. Piuri, F. Scotti, and A. Piva, "Privacy-preserving fingercode authentication," in Proc. of the 12th ACM workshop on Multimedia and security (MMSec '10). ACM, 2010, pp. 231–240.
[25] J. Bringer, H. Chabanne, and K. Simoens, "Blackbox security of biometrics (invited paper)," in Intelligent Information Hiding and Multimedia Signal Processing (IIH-MSP), 2010 6th Int. Conf. on, 2010, pp. 337–340.
[26] Q. Tang, J. Bringer, H. Chabanne, and D. Pointcheval, "A formal study of the privacy concerns in biometric-based remote authentication schemes," in ISPEC, ser. LNCS, L. Chen, Y. Mu, and W. Susilo, Eds., vol. 4991. Springer, 2008, pp. 56–70.
[27] A. K. Jain, P. Flynn, and A. A. Ross, Eds., Handbook of Biometrics. Springer, 2008.
[28] S. Goldwasser and S. Micali, "Probabilistic encryption and how to play mental poker keeping secret all partial information," in STOC. ACM, 1982, pp. 365–377.
[29] K. Crammer and Y. Singer, "On the algorithmic implementation of multiclass kernel-based vector machines," J. of Machine Learning Research, vol. 2, pp. 265–292, 2001.
[30] P. Paillier, "Public-key cryptosystems based on composite degree residuosity classes," in EUROCRYPT, ser. LNCS, J. Stern, Ed., vol. 1592. Springer, 1999, pp. 223–238.
[31] L. Blum, M. Blum, and M. Shub, "A simple unpredictable pseudo-random number generator," SIAM J. Comput., vol. 15, no. 2, pp. 364–383, 1986.
[32] M. Blum and S. Goldwasser, "An efficient probabilistic public-key encryption scheme which hides all partial information," in CRYPTO, ser. LNCS, G. Blakley and D. Chaum, Eds., vol. 196. Springer, 1984, pp. 289–302.