 Research Article
 Open Access
Binary Biometric Representation through Pairwise Adaptive Phase Quantization
 Chun Chen^{1} and
 Raymond Veldhuis^{1}
https://doi.org/10.1155/2011/543106
© Chun Chen and Raymond Veldhuis. 2011
 Received: 18 October 2010
 Accepted: 24 January 2011
 Published: 20 February 2011
Abstract
Extracting binary strings from real-valued biometric templates is a fundamental step in template compression and protection systems, such as fuzzy commitment, fuzzy extractor, secure sketch, and helper data systems. Quantization and coding are the straightforward way to extract binary representations from arbitrary real-valued biometric modalities. In this paper, we propose a pairwise adaptive phase quantization (APQ) method, together with a long-short (LS) pairing strategy, which aims to maximize the overall detection rate. Experimental results on the FVC2000 fingerprint and the FRGC face databases show reasonably good verification performance.
Keywords
 Binary String
 False Acceptance Rate
 False Rejection Rate
 Feature Pair
 Biometric Template
1. Introduction
Extracting binary biometric strings is a fundamental step in template compression and protection [1]. It is well known that biometric information is unique, yet inevitably noisy, leading to intra-class variations. Therefore, the binary strings are desired not only to be discriminative, but also to have low intra-class variations. These requirements translate into both a low false acceptance rate (FAR) and a low false rejection rate (FRR). Additionally, from the template protection perspective, we know that general biometric information is always public; thus, any person has some knowledge of the distribution of biometric features. Furthermore, the biometric bits in the binary string should be independent and identically distributed (i.i.d.), in order to maximize the attacker's effort in guessing the target template.
Several biometric template protection concepts have been published. Cancelable biometrics [2, 3] distort the image of a face or a fingerprint by using a one-way geometric distortion function. The fuzzy vault method [4, 5] is a cryptographic construction that allows a secret to be stored in a vault that can be locked using a possibly unordered set of features, for example, fingerprint minutiae. A third group of techniques, containing fuzzy commitment [6], fuzzy extractor [7], secure sketch [8], and helper data systems [9–13], derives a binary string from a biometric measurement and stores an irreversibly hashed version of the string, with or without binding a crypto key. In this paper, we adopt the third group of techniques.
The categorized one-dimensional quantizers.

User-independent            User-specific
Linnartz and Tuyls [9]      Vielhauer et al. [14]
Tuyls et al. [10]           Feng and Wah [15]
Kevenaar et al. [11]        Chang et al. [16]
Chen et al. [17]

Equal-width                 Equal-probability
Linnartz and Tuyls [9]      Tuyls et al. [10]
Vielhauer et al. [14]       Kevenaar et al. [11]
Feng and Wah [15]           Chen et al. [17]
Chang et al. [16]
Apart from the one-dimensional quantizer design, some papers focus on assigning a varying number of quantization bits to each feature. So far, several bit-allocation principles have been proposed: fixed bit allocation (FBA) [10, 11, 17] simply assigns a fixed number of bits to each feature. In contrast, the detection rate optimized bit allocation (DROBA) [19] and the area under the FRR curve optimized bit allocation (AUFOBA) [20] assign a variable number of bits to each feature, according to the features' distinctiveness. Generally, AUFOBA and DROBA outperform FBA.
Since phase quantization was shown in [21] to yield good performance, in this paper we propose a user-specific adaptive phase quantizer (APQ). Furthermore, we introduce a Mahalanobis distance-based long-short (LS) pairing strategy that, to a good approximation, maximizes the theoretical overall detection rate at a zero Hamming distance threshold.
In Section 2, we introduce the adaptive phase quantizer (APQ), with simulations in a particular case of independent Gaussian densities. In Section 3, the long-short (LS) pairing strategy is introduced to compose pairwise features. In Section 4, we give experimental results on the FVC2000 fingerprint database and the FRGC face database. In Section 5, the results are discussed, and conclusions are drawn in Section 6.
2. Adaptive Phase Quantizer (APQ)
In this section, we first introduce the APQ. Afterwards, we discuss its performance in a particular case where the feature pairs have independent Gaussian densities.
2.1. Adaptive Phase Quantizer (APQ)
Essentially, the APQ has intervals that are both equal-width and equal-probability, with a rotation offset that maximizes the detection rate.
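As an illustration, the quantizer can be sketched in a few lines (a minimal sketch under our own naming and conventions, not the paper's exact formulation; in particular, centering the enrollment template in sector 0 is our assumption about how the rotation offset is chosen):

```python
import numpy as np

def apq_encode(v1, v2, theta, b):
    """Quantize the phase of the feature pair (v1, v2) into one of
    2**b equal-width sectors, after subtracting the rotation offset theta."""
    n_sectors = 2 ** b
    phase = (np.arctan2(v2, v1) - theta) % (2 * np.pi)
    return int(phase // (2 * np.pi / n_sectors))  # sector index in 0 .. 2**b - 1

def adaptive_offset(mu1, mu2, b):
    """Choose theta so that the enrollment mean (mu1, mu2) falls at the
    center of sector 0, maximizing the tolerance to phase noise."""
    n_sectors = 2 ** b
    return np.arctan2(mu2, mu1) - np.pi / n_sectors
```

Because the sectors have equal width and, for a circularly symmetric background density, equal probability, an impostor's output bits are close to uniform, while the genuine template sits well inside its own sector.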
2.2. Simulations on Independent Gaussian Densities
3. Biometric Binary String Extraction
The APQ can be directly applied to two-dimensional features, such as iris features [22], while for arbitrary features we have the freedom to choose how to pair the features. In this section, we first formulate the pairing problem, which in practice is difficult to solve. Therefore, we simplify this problem and then propose a long-short (LS) pairing strategy with low computational complexity.
3.1. Problem Formulation
The aim of extracting a biometric binary string is as follows: for a genuine user who has 2N features, we need to determine a strategy to pair these features into N pairs, in such a way that the entire bN-bit binary string (b bits per pair) obtains optimal classification performance, when every feature pair is quantized by a b-bit APQ. Assuming that the feature pairs are statistically independent, we know from [19] that, when applying a Hamming distance classifier, a zero Hamming distance threshold gives a lower bound for both the detection rate and the FAR. Therefore, we decide to optimize this lower-bound classification performance.
The detection rate for a given feature pair is computed from (8). Considering that the performance at a zero Hamming distance threshold pinpoints the minimum FAR and detection rate values on the receiver operating characteristic (ROC) curve, optimizing this point in (15) essentially provides a maximum lower bound for the ROC curve.
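Under the stated independence assumption, this lower bound is easy to compute numerically: at a zero Hamming distance threshold every pair must be quantized correctly, so the overall detection rate is the product of the per-pair detection rates, and for i.i.d. uniform output bits the FAR is 2^(-bN). A minimal numerical sketch (function name ours):

```python
import numpy as np

def zero_threshold_rates(pair_detection_rates, b):
    """Overall detection rate and FAR of the Hamming distance classifier
    at a zero distance threshold, for N independent feature pairs that
    are each quantized into b i.i.d. uniform bits."""
    rates = np.asarray(pair_detection_rates, dtype=float)
    detection = float(np.prod(rates))   # every pair must match exactly
    far = 2.0 ** (-b * rates.size)      # an impostor must guess all bN bits
    return detection, far
```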
3.2. LongShort Pairing
There are two problems in solving (15). First, it is often not possible to compute the detection rate in (8), due to the difficulties in estimating the genuine user PDF. Additionally, even if this PDF can be accurately estimated, a brute-force search would involve (2N - 1)!! evaluations of the overall detection rate, one for each possible partition of the 2N features into N pairs, which renders a brute-force search unfeasible for realistic values of N. Therefore, we propose to simplify both the problem definition in (15) and the optimization search approach.
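The combinatorial growth behind this infeasibility is easy to verify: the number of ways to partition 2N features into N unordered pairs is the double factorial (2N - 1)!!. A quick check (function name ours):

```python
from math import prod

def num_pairings(num_features):
    """Number of ways to partition an even number of features into
    unordered pairs: (2N - 1)!! = 1 * 3 * 5 * ... * (2N - 1)."""
    assert num_features % 2 == 0
    return prod(range(1, num_features, 2))
```

Already for 50 features there are more than 10^30 candidate pairings, so exhaustive evaluation is hopeless.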
Simplified Problem Definition
with the objective defined in (11). Furthermore, instead of brute-force searching, we propose a simplified optimization search approach: the long-short (LS) pairing strategy.
LongShort (LS) Pairing
The computational complexity of the LS pairing is low, essentially that of sorting the features. Additionally, it is applicable to arbitrary feature types and is independent of the number of quantization bits b. Note that this LS pairing is similar to the pairing strategy proposed in [21], where Euclidean distances are used. In fact, there are other alternative pairing strategies, for instance greedy or long-long pairing [21]. However, in terms of the entire binary string performance, these methods are not as good as the approach presented in this paper, especially when the number of features is large. Therefore, in this paper, we choose the long-short pairing strategy, which provides a compromise between classification performance and computational complexity.
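A sketch of the strategy (our own naming; the per-feature distance would be the Mahalanobis-type distance of Section 3.1): sort the features by distance and repeatedly pair the longest remaining feature with the shortest one.

```python
import numpy as np

def ls_pairing(distances):
    """Long-short pairing: sort the features by their distance
    (longest first) and pair the i-th longest with the i-th shortest.
    The cost is dominated by the sort.  Returns a list of index pairs."""
    order = np.argsort(distances)[::-1]  # feature indices, longest distance first
    n = len(order)
    return [(int(order[i]), int(order[n - 1 - i])) for i in range(n // 2)]
```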
4. Experiments
In this section, we test the pairwise phase quantization (LS + APQ) on real data. First, we present a simplified APQ, which is employed in all the experiments. Afterwards, we verify the relation between the detection rate and the pairwise feature distance for real data. We also show some examples of LS pairing results. Then we investigate the verification performances while varying the input feature dimensionality D and the number of quantization bits per feature pair b. The results are further compared to the one-dimensional fixed quantization (1D FQ) [17] as well as the FQ combined with the DROBA bit-allocation principle (FQ + DROBA).
4.1. Experimental Setup
We tested the pairwise phase quantization on two real data sets: the FVC2000(DB2) fingerprint database [23] and the FRGC(version 1) face database [24].
(i) FVC2000: The FVC2000 (DB2) fingerprint data set contains 8 images of 110 users. The features were extracted in a fingerprint recognition system that was used in [10]. As illustrated in Figure 8, the raw features contain two types of information: the squared directional field in both the x and y directions, and the Gabor response in 4 orientations (0, π/4, π/2, 3π/4). Determined by a regular grid of 16 by 16 points with a spacing of 8 pixels, measurements are taken at 256 positions, leading to a total of 1536 elements.
(ii) FRGC: The FRGC (version 1) face data set contains 275 users with a varying number of images per user, taken under both controlled and uncontrolled conditions. The number of samples per user ranges from 4 to 36. From each image, a region of interest (ROI) with 8762 pixels was taken, as illustrated in Figure 9.
A limitation of biometric compression or protection is that user-specific image alignment is not possible, because the image or other alignment information cannot be stored. Therefore, in this paper, we applied basic absolute alignment methods: the fingerprint images are aligned according to a standard core point position; the face images are aligned according to a set of four standard landmarks, that is, the two eyes, the nose, and the mouth.
Data division: number of users × number of samples per user (s), and the number of trials, for FVC2000 and FRGC. Here s is a parameter that varies in the experiments.
            Training    Enrollment    Verification    Trials
FVC2000                                                 20
FRGC                                                     5
Our experiments involved three steps: training, enrollment, and verification. (1) In the training step, we first applied a combined PCA/LDA method [25] on a training set. The obtained transformation was then applied to both the enrollment and verification sets. We assume that the measurements have a Gaussian density; thus, after the PCA transformation, the extracted features are assumed to be statistically independent. The goal of applying PCA/LDA in the training step is to extract independent features, so that by pairing them we subsequently obtain independent feature pairs, which meets our problem requirements. Note that for FVC2000, since we have only 80 users in the training set, applying LDA results in a very limited number of features. Therefore, we relax the independence requirement for the genuine user by applying only the PCA transformation. (2) In the enrollment step, for every genuine user, the LS pairing was first applied, resulting in the user-specific pairing configuration. The pairwise features were further quantized through a b-bit APQ with the adaptive rotation angle, and assigned a Gray code [26]. The concatenation of the codes from all feature pairs formed the target binary string. Both the target string and the quantization information (the pairing configuration and the rotation angles) were stored for each genuine user. (3) In the verification step, the features of the query user were quantized and coded according to the quantization information of the claimed identity, leading to a query binary string. Finally, the decision was made by comparing the Hamming distance between the query and the target string against a threshold.
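The coding and matching in steps (2) and (3) can be sketched as follows (a minimal sketch; the sector indices are assumed to come from the APQ of Section 2, and the function names are ours):

```python
def gray_code(index, b):
    """b-bit Gray code of a sector index: adjacent (and wrap-around)
    sectors differ in exactly one bit, so small phase noise costs at
    most one bit error."""
    return format(index ^ (index >> 1), '0{}b'.format(b))

def extract_string(sector_indices, b):
    """Concatenate the Gray codes of all feature pairs into the
    target (or query) binary string."""
    return ''.join(gray_code(i, b) for i in sector_indices)

def accept(target, query, threshold):
    """Verification decision: accept when the Hamming distance between
    the target and query strings does not exceed the threshold."""
    distance = sum(t != q for t, q in zip(target, query))
    return distance <= threshold
```

For example, four pairs quantized with a 2-bit APQ yield an 8-bit string; a single flipped bit is accepted at threshold 1 but rejected at the zero threshold used in the lower-bound analysis.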
4.2. Simplified APQ
4.3. APQ  Property
In this section, we test the relation between the APQ detection rate and the pairwise feature distance on both data sets. The goal is to see whether the real data exhibit the same property as we found with synthetic data in Section 2.2: feature pairs with a larger distance obtain a higher detection rate.
4.4. LS Pairing Performance
4.5. Verification Performance
We test the performance of LS + APQ at various numbers of input features D as well as various numbers of quantization bits b. The performance is further compared with the one-dimensional fixed quantization (1D FQ) [17]. The EER results for FVC2000 and FRGC are shown in Table 3 and Figure 15.
The EER performances of LS + APQ and 1D FQ, at various feature dimensionalities D and various numbers of quantization bits b, for (a) FVC2000 and (b) FRGC.
(a) FVC2000, EER (%)

D                  100   150   200   250   300
LS + APQ:
            4.4    2.8   2.0   1.9   1.8   1.9
            4.6    3.0   2.0   2.1   1.7   1.6
            6.4    3.7   2.8   2.6   2.5   2.7
            8.2    5.9   4.6   3.4   3.2   3.3
           10.0    6.6   5.9   4.4   4.0   3.7
           11.4    7.1   6.6   5.4   4.7   4.7
1D FQ:
            6.7    4.0   2.9   2.6   2.7   2.3
            7.5    5.3   4.2   3.6   3.6   3.6
            9.2    6.4   5.5   5.0   5.2   4.9

(b) FRGC, EER (%)

D                  80    100   120   150   180   200
LS + APQ:
            4.0    3.4   3.0   2.6   2.9   2.7   2.7
            3.5    3.0   2.8   2.3   2.8   2.7   2.9
            4.7    4.1   3.7   3.4   3.3   3.6   3.9
            6.7    5.9   5.0   4.8   4.7   5.0   5.2
            8.1    7.0   6.3   6.1   6.5   6.6   6.4
           10.1    8.6   7.5   7.2   7.2   7.4   7.6
1D FQ:
            5.7    4.7   4.2   4.0   4.1   4.1   4.2
            5.1    5.4   5.1   5.0   5.2   5.9   6.1
            6.5    6.5   6.4   6.2   6.5   6.9   7.3
The FAR/FRR performances for FVC2000 and FRGC at the best D, b setting.

                 FRR (%)
FVC2000     17.2     9.6    2.6
FRGC        14.7     8.2    3.7
In [19], it was shown that FQ in combination with the DROBA adaptive bit-allocation principle (FQ + DROBA) provides considerably good performance. Therefore, we compare LS + APQ with FQ + DROBA. In order to compare both methods at the same setting, for LS + APQ we extract only a subset of the features and form the feature pairs through the LS pairing. Afterwards, we apply the 2-bit APQ to every feature pair (see Figure 3). Table 6 shows the EER performances of LS + APQ and FQ + DROBA at several different settings. The results show that LS + APQ obtains slightly better performance than FQ + DROBA.
The EER performances of LS + APQ and FQ + DROBA, at several settings, for (a) FVC2000 and (b) FRGC.
(a) FVC2000, EER (%)

LS + APQ      2.3    1.7    1.9
FQ + DROBA    2.4    2.1    2.2

(b) FRGC, EER (%)

LS + APQ      2.3    2.4    2.3
FQ + DROBA    2.4    2.6    2.8
5. Discussion
6. Conclusion
Extracting binary biometric strings is a fundamental step in biometric compression and template protection. Unlike many previous works, which quantize features individually, in this paper we propose a pairwise adaptive phase quantization (APQ), together with a long-short (LS) pairing strategy, which aims to maximize the overall detection rate. Experimental results on the FVC2000 and the FRGC databases show reasonably good verification performance.
Declarations
Acknowledgment
This research is supported by the research program Sentinels (http://www.sentinels.nl/). Sentinels is being financed by Technology Foundation STW, the Netherlands Organization for Scientific Research (NWO), and the Dutch Ministry of Economic Affairs.
References
 1. Jain AK, Nandakumar K, Nagar A: Biometric template security. EURASIP Journal on Advances in Signal Processing 2008, 2008.
 2. Ratha NK, Connell JH, Bolle RM: Enhancing security and privacy in biometrics-based authentication systems. IBM Systems Journal 2001, 40(3):614-634.
 3. Ratha NK, Chikkerur S, Connell JH, Bolle RM: Generating cancelable fingerprint templates. IEEE Transactions on Pattern Analysis and Machine Intelligence 2007, 29(4):561-572.
 4. Juels A, Sudan M: A fuzzy vault scheme. Designs, Codes, and Cryptography 2006, 38(2):237-257. 10.1007/s10623-005-6343-z
 5. Nandakumar K, Jain AK, Pankanti S: Fingerprint-based fuzzy vault: implementation and performance. IEEE Transactions on Information Forensics and Security 2007, 2(4):744-757.
 6. Juels A, Wattenberg M: Fuzzy commitment scheme. Proceedings of the 6th ACM Conference on Computer and Communications Security (ACM CCS '99), November 1999, 28-36.
 7. Dodis Y, Reyzin L, Smith A: Fuzzy extractors: how to generate strong keys from biometrics and other noisy data. Proceedings of the International Conference on the Theory and Applications of Cryptographic Techniques, May 2004, Lecture Notes in Computer Science 3027: 523-540.
 8. Chang EC, Roy S: Robust extraction of secret bits from minutiae. Proceedings of the 2nd International Conference on Biometrics (ICB '07), 2007, Lecture Notes in Computer Science 4642: 750-759.
 9. Linnartz JP, Tuyls P: New shielding functions to enhance privacy and prevent misuse of biometric templates. Proceedings of Audio- and Video-Based Biometric Person Authentication (AVBPA '03), 2003, Guildford, UK, Lecture Notes in Computer Science 2688: 393-402.
10. Tuyls P, Akkermans AHM, Kevenaar TAM, Schrijen GJ, Bazen AM, Veldhuis RNJ: Practical biometric authentication with template protection. Proceedings of the 5th International Conference on Audio- and Video-Based Biometric Person Authentication (AVBPA '05), July 2005, Hilton Rye Town, NY, USA, Lecture Notes in Computer Science 3546: 436-446.
11. Kevenaar TAM, Schrijen GJ, van der Veen M, Akkermans AHM, Zuo F: Face recognition with renewable and privacy preserving binary templates. Proceedings of the 4th IEEE Workshop on Automatic Identification Advanced Technologies (AutoID '05), October 2005, New York, NY, USA, 21-26.
12. Hao F, Anderson R, Daugman J: Combining crypto with biometrics effectively. IEEE Transactions on Computers 2006, 55(9):1081-1088.
13. Teoh ABJ, Goh A, Ngo DCL: Random multispace quantization as an analytic mechanism for BioHashing of biometric and random identity inputs. IEEE Transactions on Pattern Analysis and Machine Intelligence 2006, 28(12):1882-1901.
14. Vielhauer C, Steinmetz R, Mayerhöfer A: Biometric hash based on statistical features of online signatures. Proceedings of the 16th International Conference on Pattern Recognition (ICPR '02), 2002, Quebec, Canada, 1: 123-126.
15. Feng H, Wah CC: Private key generation from online handwritten signatures. Information Management and Computer Security 2002, 10(4):159-164. 10.1108/09685220210436949
16. Chang YJ, Zhang W, Chen T: Biometrics-based cryptographic key generation. Proceedings of the IEEE International Conference on Multimedia and Expo (ICME '04), June 2004, Taipei, Taiwan, 3: 2203-2206.
17. Chen C, Veldhuis RNJ, Kevenaar TAM, Akkermans AHM: Multi-bits biometric string generation based on the likelihood ratio. Proceedings of the 1st IEEE International Conference on Biometrics: Theory, Applications, and Systems (BTAS '07), September 2007.
18. Chen C, Veldhuis RNJ, Kevenaar TAM, Akkermans AHM: Biometric binary string generation with detection rate optimized bit allocation. Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (CVPR '08), June 2008.
19. Chen C, Veldhuis RNJ, Kevenaar TAM, Akkermans AHM: Biometric quantization through detection rate optimized bit allocation. EURASIP Journal on Advances in Signal Processing 2009, 2009.
20. Chen C, Veldhuis RNJ: Extracting biometric binary strings with minimal area under the FRR curve for the Hamming distance classifier. Proceedings of the 17th European Signal Processing Conference (EUSIPCO '09), 2009.
21. Chen C, Veldhuis R: Binary biometric representation through pairwise polar quantization. Proceedings of the 3rd International Conference on Advances in Biometrics (ICB '09), June 2009, Alghero, Italy, Lecture Notes in Computer Science 5558: 72-81.
22. Daugman J: The importance of being random: statistical principles of iris recognition. Pattern Recognition 2003, 36(2):279-291. 10.1016/S0031-3203(02)00030-4
23. Maio D, Maltoni D, Cappelli R, Wayman JL, Jain AK: FVC2000: fingerprint verification competition. IEEE Transactions on Pattern Analysis and Machine Intelligence 2002, 24(3):402-412. 10.1109/34.990140
24. Phillips PJ, Flynn PJ, Scruggs T, Bowyer KW, Chang J, Hoffman K, Marques J, Min J, Worek W: Overview of the face recognition grand challenge. Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR '05), June 2005, San Diego, Calif, USA, 947-954.
25. Veldhuis R, Bazen A, Kauffman J, Hartel P: Biometric verification based on grip-pattern recognition. Security, Steganography, and Watermarking of Multimedia Contents VI, January 2004, San Jose, Calif, USA, Proceedings of SPIE 5306: 634-641.
26. Gardner M: The Binary Gray Code. W. H. Freeman, New York, NY, USA; 1986.
Copyright
This article is published under license to BioMed Central Ltd. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.