Texture based features for robust palmprint recognition: a comparative study
EURASIP Journal on Information Security volume 2015, Article number: 5 (2015)
Abstract
Palmprint is a widely used biometric trait deployed in various access-control applications due to its convenience of use, reliability, and low cost. In this paper, we propose a novel scheme for palmprint recognition using a sparse representation of features obtained from a Bank of Binarized Statistical Image Features (BBSIF). The palmprint image is characterized by a rich set of features including principal lines, ridges, and wrinkles. Thus, the use of an appropriate texture descriptor scheme is expected to capture this information accurately. To this extent, we explore the idea of BBSIF, which comprises 56 different BSIF filters whose responses on the given palmprint image are processed independently and classified using a sparse representation classifier (SRC). Extensive experiments are carried out on three different large-scale publicly available palmprint databases. We then present an extensive analysis by comparing the proposed scheme with seven different contemporary state-of-the-art schemes, which reveals the efficacy of the proposed scheme for robust palmprint recognition.
Introduction
Biometric systems are widely used in access control and security-based applications. The goal of a biometric system is to utilize physical and/or behavioral characteristics to identify/verify the subject of interest. There exist various kinds of biometric systems based on physical and/or behavioral cues such as the face, iris, speech, keystrokes, palmprint, retina, and so on. Among these, the palmprint-based biometric system, which has been investigated for over 15 years, has demonstrated its applicability as a successful biometric modality. Palmprints exhibit unique characteristics that can be captured using texture features arising from the presence of palm creases, wrinkles, and ridges. Furthermore, palmprints can be captured using low-cost sensors with very low-resolution imaging of 75 dots per inch (dpi) [1, 2]. Further, recent work [3] has demonstrated the anti-spoofing nature of palmprints, which establishes the palmprint as a highly reliable biometric characteristic.
The increasing popularity of palmprint biometrics has resulted in various feature extraction techniques that have contributed to boosting the accuracy of palmprint verification. The available techniques can be broadly classified into the following five types, namely: (1) local feature-based approaches, (2) statistics-based approaches, (3) appearance-based approaches, (4) texture-based approaches, and (5) hybrid approaches. The local feature extraction techniques are based on extracting features such as ridges, delta points, minutiae, and palm creases (or principal lines). The local features of the palmprint can be extracted using various techniques that include the line segment approach [4], morphological median wavelet [5], Sobel operator [6], Canny operator [6], Plessey operator [7], and wide-line detection operator [8]. Even though local features are proven to achieve accurate performance, these methods demand very high-resolution palmprint images, thereby increasing the cost of the sensor. The statistics-based approaches extract features that correspond to mean, variance, moments, and energy. There exist various techniques to capture the statistics of the palmprint, including the wavelet transform [9], Fourier transform [10], cepstrum energy [11], sub-block energy based on the Gabor transform [12, 13], micro-scale invariant Gabor features [14], and Zernike moments [15]. However, statistics-based approaches are not robust against sensor noise. The appearance-based approaches map the data from a high-dimensional to a low-dimensional space to achieve high accuracy as well as fast comparison.
The most popular appearance-based techniques include Principal Component Analysis (PCA) [16], 2DPCA [17], bidirectional PCA [18], (2D)^{2}PCA [19], independent component analysis (ICA) [20], linear discriminant analysis (LDA) [21], kernel-based approaches like kernel discriminant analysis (KDA) [13] and kernel PCA (KPCA) [22], and generative model-based approaches, namely the PCA mixture model (PCAMM) and the ICA mixture model (ICAMM) [23]. Even though the appearance-based models can perform on par with the statistics-based approaches, they still lack robustness against noise as well as against variation of palmprint templates over time. The texture-based schemes normally extract the global patterns of lines, ridges, and wrinkles that contribute to robust palmprint recognition. Among the available texture extraction schemes, local binary patterns (LBP) [24], the Gabor transform [13], palmcode [25], ordinal code [26], fusion code [27], competitive code [28], and contour code [29] have been shown to perform accurately even on low-resolution palmprint images. The hybrid schemes [30, 31] combine more than one of the above-mentioned schemes so as to address the shortcomings of the individual schemes. When compared to all five types of schemes, the hybrid schemes appear to be the most robust and accurate for palmprint recognition. Table 1 shows the characteristics of the existing palmprint feature extraction schemes in terms of computational complexity and accuracy. Detailed surveys on palmprint recognition can be found in [32, 33].
In this work, we propose a simple and novel approach for palmprint verification based on the sparse representation of features derived from the Bank of Binarized Statistical Image Features (BBSIF) [34]. BSIF [34] is a texture descriptor similar to LBP, but the difference lies in the way the filters are obtained: BSIF filters are learned from natural images, while the LBP filters are manually predefined. To the best of our knowledge, no prior work in the literature uses Binarized Statistical Image Features (BSIF) for palmprint verification. With this backdrop, in our previous work [35], we made an initial attempt towards the sparse representation of BSIF. In this paper, that work is extended in several directions: by exploiting a bank of 56 different BSIF filters (BBSIF), we significantly reduce the equal error rate (EER). Overall, the following are the main contributions of this work:

A new method based on the Bank of BSIF (BBSIF) and sparse representation classifier (SRC) for palmprint recognition.

Extensive experiments are carried out on the following three different palmprint databases, namely: PolyU contact palmprint database [36] with 356 subjects, IIT Delhi contactless palmprint database [37] with 236 subjects, and Multispectral palmprint PolyU database [3] with 500 subjects.

Comprehensive analysis by comparing the proposed scheme with seven different contemporary state-of-the-art schemes based on LBP [24], palmcode [25], ordinal code [26], fusion code [27], the Gabor transform with KDA [13], the Gabor transform with sparse representation [38], and also with our previously proposed technique based on the sparse representation of BSIF [35].
All in all, the proposed framework is simple, novel, and the first of its kind in the palmprint verification/recognition literature, and it is expected to open up a new dimension for further research in the field of palmprint biometrics.
The rest of the paper is structured as follows: Section 2 presents the proposed scheme for robust palmprint recognition, Section 3 discusses the experimental setup, protocols, and results, and Section 4 draws the conclusion.
Proposed method
Figure 1 shows the block diagram of the proposed Bank of BSIF and sparse representation classifier (SRC) based scheme for palmprint recognition. The proposed scheme can be structured in two main steps.
Region of interest extraction
The main idea of the region of interest (RoI) extraction is to extract from the palmprint the significant region that contains a rich set of features such as principal lines, ridges, and wrinkles, while compensating for rotation and translation. The accurate extraction of the RoI plays a crucial role in improving the performance of overall palmprint recognition. In this work, we have employed the algorithm proposed in [23], which is based on aligning the palmprint by computing the center of mass and locating the valley regions. We carried out this RoI extraction scheme only on the PolyU palmprint database, as the other two databases (MSPolyU and IITD) already provide the RoI images.
Bank of BSIF features and sparse representation classifier
The idea behind the proposed BBSIF is to construct a bank of filters trained on a set of natural images. Traditionally, one can train BSIF filters in an unsupervised manner using popular techniques such as Restricted Boltzmann Machines (RBMs) [39, 40], autoencoders [41], sparse coding [42], and independent component analysis (ICA) [43, 44]. Among these schemes, ICA is the most appealing choice, as it avoids the tuning of large sets of hyperparameters and provides a statistically independent basis that in turn can be used as filters to extract features from a given image. Thus, given the natural images, we first normalize them to have zero mean and unit variance [34]. Then, we sample N _{ Im } patches to learn the BSIF filters using ICA. The size of the image patches sampled from the natural images fixes the size of the BSIF filter to be learned, and the number of top ICA basis vectors retained determines the length of the BSIF filter. For instance, a BSIF filter of size 5×5 and length 8 corresponds to the top 8 ICA basis vectors learned from 5×5 image patches sampled from natural images. Thus, by varying both size and length, one can learn a number of different BSIF filters from natural images. In this work, we consider 56 different pre-learned filters with varying size and length that constitute the Bank of BSIF (BBSIF) filters.
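The unsupervised filter-learning procedure described above can be illustrated with a small sketch. This is an assumption-laden illustration, not the authors' code: it uses scikit-learn's FastICA (which performs the mean-removal and PCA-whitening steps internally), and `learn_bsif_filters` is a hypothetical helper name.

```python
import numpy as np
from sklearn.decomposition import FastICA

def learn_bsif_filters(natural_images, k=5, L=8, n_patches=2000, seed=0):
    """Learn L statistically independent k x k filters from a list of
    grayscale natural images by sampling random patches and running ICA."""
    rng = np.random.default_rng(seed)
    patches = np.empty((n_patches, k * k))
    for p in range(n_patches):
        img = natural_images[rng.integers(len(natural_images))]
        r = rng.integers(img.shape[0] - k + 1)
        c = rng.integers(img.shape[1] - k + 1)
        patch = img[r:r + k, c:c + k].astype(float)
        patches[p] = (patch - patch.mean()).ravel()   # mean subtraction per patch
    # PCA whitening and estimation of an independent basis (done inside FastICA)
    ica = FastICA(n_components=L, random_state=seed, max_iter=500)
    ica.fit(patches)
    return ica.components_.reshape(L, k, k)           # each row is one filter

# Toy run on synthetic images; real BSIF filters are learned from the
# 13 natural images of [45]
imgs = [np.random.default_rng(i).standard_normal((64, 64)) for i in range(3)]
filters = learn_bsif_filters(imgs, k=5, L=8)
```

In practice one would use the pre-learned filters released with [34] rather than re-learning them.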
In this work, we have employed the open-source filters [34] that are learned using 50,000 image patches randomly sampled from 13 different natural images [45]. The learning process to construct these statistically independent filters has three main steps: (1) mean subtraction of each patch, (2) dimensionality reduction using principal component analysis (PCA), and (3) estimation of statistically independent filters (or basis) using independent component analysis (ICA). Thus, given the palmprint image I _{ P }(m,n) and a BSIF filter \(W_{i}^{k \times k}\), the filter response is obtained as follows [34]:
$$ s_{i}(m,n) = I_{P}(m,n) \times W_{i}^{k \times k} $$((1))
where × denotes the convolution operation, m and n denote the size of the palmprint image patch, \(W_{i}^{k \times k}\), ∀ i={1,2,…,L}, denotes the i-th filter with L being the length of the BSIF descriptor, and k×k indicates the size of the BSIF filter. The responses of all L filters can be computed together and binarized to obtain a binary string as follows [34]:
$$ b_{i}(m,n) = \left\{\begin{array}{ll} 1, & \text{if} \ s_{i}(m,n) > 0 \\ 0, & \text{otherwise} \end{array}\right. $$((2))
Finally, the BSIF features are extracted by considering, at each pixel (m, n), the set of binary values obtained from the L linear filters. Mathematically, for a given pixel (m, n) and its corresponding binary representation b _{ i }(m,n), the BSIF encoded features are obtained as follows:
$$ \mathrm{BSIF}(m,n) = \sum_{i=1}^{L} b_{i}(m,n) \cdot 2^{i-1} $$((3))
The whole procedure of BSIF extraction is illustrated in Fig. 2 by considering a BSIF filter \( W_{8}^{17 \times 17}\), which is of length 8 and size 17×17. Figure 2a shows the input RoI of the palmprint image. Figure 2b shows the learned BSIF filter of size 17×17 and length 8. Figure 2c shows the results of the individual convolutions of the palmprint image with the BSIF filter as mentioned in Eq. 1. Figure 2d shows the final BSIF feature encoded using Eq. 3 obtained on the palmprint RoI shown in Fig. 2a.
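The convolution, binarization, and binary-weighted encoding steps described above can be sketched compactly. The following is a minimal illustration (not the authors' Matlab implementation), assuming a grayscale RoI as a NumPy array and a stack of pre-learned filters; `bsif_encode` is a hypothetical helper name.

```python
import numpy as np
from scipy.signal import convolve2d

def bsif_encode(image, filters):
    """Encode an image with a stack of BSIF filters.

    image   : 2-D array (grayscale palmprint RoI)
    filters : array of shape (L, k, k) -- L pre-learned ICA filters
    Returns an integer code image with values in [0, 2**L - 1].
    """
    L = filters.shape[0]
    code = np.zeros(image.shape, dtype=np.int64)
    for i in range(L):
        # filter response via 2-D convolution
        s = convolve2d(image, filters[i], mode="same", boundary="symm")
        # binarize the response at zero
        b = (s > 0).astype(np.int64)
        # weight the i-th bit by 2**i and accumulate into the code image
        code += b << i
    return code

# Toy example with random stand-in "filters" (real BSIF filters are
# learned with ICA from natural images)
rng = np.random.default_rng(0)
img = rng.standard_normal((32, 32))
filt = rng.standard_normal((8, 5, 5))
codes = bsif_encode(img, filt)
```

With L = 8 filters, each pixel of `codes` is an 8-bit value, analogous to an LBP code image.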
To achieve good performance in palmprint recognition using BSIF, we need to consider two important factors, namely the filter size and the filter length. However, the use of a single filter with a fixed length may not be capable of capturing sufficient information to achieve accurate palmprint recognition. Thus, in this work, we propose to use a bank of filters with varying filter size and length. The filter size is varied from 5×5 to 17×17 in steps of 2, giving filters of seven different sizes. In a similar manner, we vary the length of the filter (i.e., the number of independent components) from 5 to 12 in steps of 1 to get eight different lengths. Thus, our ensemble has 7×8=56 filters, and the response for the palmprint image is obtained independently for each. Given a palmprint sample P(m,n), we get 56 independent BSIF coded images R _{ P }={R _{ P1},R _{ P2},…,R _{ P56}}.
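As a quick sanity check on the configuration described above, the 56 (size, length) pairs of the bank can be enumerated directly:

```python
# Sizes 5x5 to 17x17 in steps of 2, lengths 5 to 12 bits
sizes = list(range(5, 18, 2))      # [5, 7, 9, 11, 13, 15, 17] -> 7 sizes
lengths = list(range(5, 13))       # [5, 6, ..., 12]           -> 8 lengths
bank = [(k, L) for k in sizes for L in lengths]
print(len(bank))                   # 7 sizes x 8 lengths = 56 filters
```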
Figure 3 illustrates the various BSIF filters included in the proposed BBSIF scheme. Figure 4 shows qualitative results on an example palmprint with varying filter size at a fixed bit length of 8. It is interesting to observe that, as the filter size increases, the distinctive information about the coarse palm lines also increases. Thus, using filters of various lengths also captures complementary information from the palmprint. Furthermore, the variation in wrinkles and ridges among different palmprints can be characterized more accurately using a bank of BSIF filters than using a single BSIF filter.
Given the palmprint sample I _{ P }(m,n), we obtain the response of I _{ P }(m,n) to all BSIF filters in the BBSIF; we then perform the sparse representation of the obtained responses individually for each filter. Thus, the sparse representation of the features obtained from each filter in the BBSIF is carried out as follows:

1.
Given the reference palmprint samples, we first extract the BSIF features (corresponding to one filter) and construct a training dictionary T _{ r } for all C classes (or subjects) as follows:
$$ T_{r} = \left[T_{r1}, T_{r2}, \ldots, T_{rC}\right] \in \Re^{N \times \left(n_{u} \cdot C\right)} $$((4))where n _{ u } denotes the number of reference samples for each class and N indicates the dimension of the BSIF features obtained on the n _{ u } reference samples from each of the C classes (or subjects).

2.
Given a test (or probe) sample, we extract its BSIF features T _{ e } (corresponding to the same filter as above), which can be considered a linear combination of the training vectors:
$$ T_{e} = T_{r}\alpha $$((5))where
$$ \alpha = \left[ \alpha_{11}, \ldots, \alpha_{1n_{u}}, \alpha_{21}, \ldots, \alpha_{2n_{u}}, \ldots, \alpha_{C1}, \ldots, \alpha_{{Cn}_{u}}\right] $$((6))
3.
Solve the l _{1} minimization problem [46] as follows:
$$ \hat{\alpha} = \arg \min_{\alpha^{'} \in \Re^{\left(n_{u} \cdot C\right)}} \left\|\alpha^{'}\right\|_{1} \quad \text{subject to} \quad T_{e} = T_{r}\alpha^{'} $$((7))
4.
Calculate the residual for each class c as follows:
$$ r_{c}\left(T_{e}\right) = \left\|T_{e} - T_{r}\,\Pi_{c}\left(\hat{\alpha}\right)\right\|_{2} $$((8))where \(\Pi_{c}(\hat{\alpha})\) retains only those coefficients of \(\hat{\alpha}\) that are associated with class c.
5.
Obtain the comparison score from the residual errors to evaluate the performance of the overall system.
Finally, we repeat the above-mentioned steps 1–5 for all 56 BSIF filters in the bank and obtain the final comparison score as the minimum of the residual errors obtained over all 56 filters in the bank.
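As an illustration, steps 1–5 for a single filter of the bank might be sketched as follows. This is a simplified sketch, not the authors' implementation: it uses scikit-learn's Lasso as a stand-in for the exact l1-minimization solver of [46], and the names `src_residuals`, `T_r`, and `labels` are hypothetical.

```python
import numpy as np
from sklearn.linear_model import Lasso

def src_residuals(T_r, labels, t_e, lam=0.01):
    """Sparse-code the probe t_e over the reference dictionary T_r
    (columns = reference feature vectors) and return per-class residuals."""
    # normalize dictionary columns and the probe to unit l2 norm
    D = T_r / np.linalg.norm(T_r, axis=0, keepdims=True)
    y = t_e / np.linalg.norm(t_e)
    # l1-regularized coding (Lasso as an approximate l1 solver for Eq. 7)
    alpha = Lasso(alpha=lam, fit_intercept=False, max_iter=10000).fit(D, y).coef_
    residuals = {}
    for c in np.unique(labels):
        a_c = np.where(labels == c, alpha, 0.0)     # keep class-c coefficients only
        residuals[c] = np.linalg.norm(y - D @ a_c)  # class-wise residual (Eq. 8)
    return residuals

# Toy dictionary: 2 classes, 3 reference feature vectors each (20-dim)
rng = np.random.default_rng(1)
T_r = rng.standard_normal((20, 6))
labels = np.array([0, 0, 0, 1, 1, 1])
probe = T_r[:, 0] + 0.05 * rng.standard_normal(20)  # probe close to class 0
r = src_residuals(T_r, labels, probe)
best_class = min(r, key=r.get)
```

In the full scheme, these residuals would be computed once per filter, and the final comparison score is the minimum residual over the 56 filters of the bank.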
Experimental results and discussion
This section presents the experimental results obtained with the proposed scheme for palmprint recognition. Extensive experiments are carried out on three different large-scale publicly available palmprint databases: (1) the PolyU palmprint database [36], (2) the IIT Delhi palmprint database [37], and (3) the Multispectral palmprint PolyU database [3]. All experimental results are presented in terms of the equal error rate (EER), and we also present the statistical validation of the results with a 90 % confidence interval [13]. In the following section, we present the experimental protocol adopted in this work.
Assessment protocol
This section describes the evaluation protocol adopted in this work on the three different palmprint databases, which is the same as in our previous work [35].
PolyU palmprint database This database comprises 352 subjects such that each subject has ten samples collected in two different sessions. For our experiments, we consider all ten samples from the first session as references and all samples from the second session as probe samples. The database is available for research purposes at http://www4.comp.polyu.edu.hk/~biometrics/MultispectralPalmprint/MSP.htm.
IIT Delhi palmprint database This database consists of 235 subjects with both left and right palmprint samples. Each subject has five samples captured independently from both the left and right palms. To evaluate this database, we consider four samples as references and the remaining sample as the probe. We repeat this selection of reference and probe samples using leave-one-out cross-validation with k = 10, and finally, we present the result by averaging the performance over all ten runs. The database is available for research purposes at http://www4.comp.polyu.edu.hk/~csajaykr/IITD/Database_Palm.htm.
Multispectral PolyU palmprint database This database consists of 500 subjects whose palmprint samples are captured in two different sessions in four different spectra: blue, red, green, and near-infrared (NIR). Each session has six samples per subject. We select the samples from the first session as references and the second-session samples as probes. We repeat this procedure for all four spectral bands, and the results are presented independently. The database is available for research purposes at http://www4.comp.polyu.edu.hk/~biometrics/MultispectralPalmprint/MSP.htm.
Results and discussion
Figure 5 shows the qualitative results of the proposed scheme along with the five different state-of-the-art schemes employed in this work. It can be observed that the use of BSIF features appears to capture the palmprint features, characterized in terms of ridges and wrinkles, more accurately. This qualitative result shows the superiority of the proposed BSIF features for palmprint recognition.
Table 2 shows the performance of the proposed scheme based on BBSIF and SRC on the PolyU palmprint database. It can be observed that the proposed scheme shows the best performance with an EER of 4.06 %, indicating a performance improvement of over 2 % compared to our previous scheme [35] based on single BSIF features. This further justifies the applicability of the proposed scheme for palmprint recognition.
Table 3 tabulates the quantitative performance of the proposed scheme on the IITD contactless palmprint database. Here, we present the results individually for both left and right palmprint samples. As noticed from Table 3, the proposed scheme shows outstanding performance with an EER of 0.12 % on the left palmprint samples and 0.72 % on the right palmprint samples. This indicates a performance improvement of over 1 % compared to our previous scheme [35] based on single BSIF features. This further justifies the applicability of the proposed scheme on yet another kind of database, in which the palmprint samples are captured in a contactless fashion.
Table 4 shows the performance of the proposed scheme on the multispectral PolyU palmprint database. Here also it can be noted that the proposed scheme achieved outstanding performance with an EER of 0 %. These results further justify the applicability of the proposed scheme on palmprint samples captured in different spectral bands.
Thus, from the above experiments, it can be observed that the proposed scheme shows the best performance when compared with five well-established state-of-the-art schemes for palmprint recognition. Further, the performance achieved by the proposed scheme on three different databases justifies its robustness and applicability for palmprint recognition.
Table 5 shows the computation time of the various algorithms used in this work. All the algorithms are implemented in Matlab on a PC with an Intel i7 processor, 8 GB RAM, and Windows 7. Note that the implementations are not optimized for speed; hence, the computation times provided in Table 5 serve only as a reference.
Conclusions
Accurate representation of the features plays a vital role in improving the accuracy and reliability of palmprint recognition. In this paper, we have introduced a novel approach for palmprint recognition based on BBSIF and SRC. The main idea of the proposed method is to use multiple BSIF filters with various sizes and lengths to constitute an ensemble (or bank) of BSIF filters. Since each of these BSIF filters is learned from natural images using independent component analysis (ICA), they exhibit the property of statistical independence. We proposed to build the BBSIF with 56 different BSIF filters. Each of these filters is then associated with the SRC, which performs the sparse representation of each BSIF filter response. Thus, given a palmprint sample, we obtain its response to each BSIF filter and then obtain the corresponding comparison score using the SRC. Finally, we select the best comparison score, which corresponds to the minimum value of the residual error. The proposed method is validated by conducting extensive experiments on three different large-scale publicly available databases, which indicated outstanding performance. The performance of the proposed scheme is compared with seven well-established state-of-the-art schemes. The obtained results show that the proposed scheme is an efficient and robust tool for accurate palmprint recognition.
References
 1
D Zhang, Palmprint authentication (Springer-Verlag, Guangzhou, China, 2004).
 2
A Genovese, V Piuri, F Scotti, Advances in information security, vol. 60 (Springer, 2014). ISBN 978-3-319-10365-5.
 3
D Zhang, Z Guo, G Lu, YLL Zhang, W Zuo, Online joint palmprint and palmvein verification. Expert Syst. Appl. 38(3), 2621–2631 (2011).
 4
W Shu, D Zhang, in Pattern Recognition, 1998. Proceedings. Fourteenth International Conference On, 1. Palmprint verification: an implementation of biometric technology, pp. 219–221. Brisbane, Qld, 1998.
 5
D qingyun, Y yinglin, Z dapeng, A line feature extraction method based on morphological median pyramid. J. South China Univ. Technol. 29(5), 14–18 (2001).
 6
CC Han, HL Cheng, CL Lin, KC Fan, Personal authentication using palmprint features. Pattern Recognit. 36(2), 371–381 (2003).
 7
JA Noble, Finding corners. Image Vis. Comput. 6(2), 121–128 (1988).
 8
L Liu, D Zhang, in IEEE International Conference on Image Processing (ICIP), 3. Palmline detection, (2005), pp. 269–272. doi:http://dx.doi.org/10.1109/ICIP.2005.1530380
 9
JY Gan, DP Zhou, in Signal Processing, 2006 8th International Conference On, 3. A novel method for palmprint recognition based on wavelet transform, pp. 1–7. Beijing, 2006.
 10
X yuli, Palmprint feature extraction based on low frequency distribution. Microcomput. Appl. 20(1), 40–43 (2011).
 11
MMM Fahmy, Palmprint recognition based on mel frequency cepstral coefficients feature extraction. Ain Shams Eng. J. 1(1), 39–47 (2010).
 12
Y Zhang, D Zhao, G Sun, Q Guo, B Fu, in Artificial Intelligence and Computational Intelligence (AICI), 2010 International Conference On, 1. Palm print recognition based on subblock energy feature extracted by real 2dgabor transform, pp. 124–128. Sanya, 2010.
 13
R Raghavendra, B Dorizzi, A Rao, GH Kumar, Designing efficient fusion schemes for multimodal biometric systems using face and palmprint. Pattern Recognit. 44(5), 1076–1088 (2011).
 14
P xin, R qiuqi, W yanxia, Palmprint recognition using gabor local relative features. Comput. Eng. Appl. 48(15), 34–38 (2012).
 15
GS Badrinath, N Kachhi, P Gupta, Verification system robust to occlusion using loworder zernike moments of palmprint subimages. Telecommun. Syst. 47(3–4), 275–290 (2011).
 16
G Lu, D Zhang, K Wang, Palmprint recognition using eigenpalms features. Pattern Recognit. Lett. 24(9–10), 1463–1467 (2003).
 17
H Sang, W Yuan, Z Zhang, in Advances in Neural Networks – ISNN 2009. Lecture Notes in Computer Science, 5552, ed. by W Yu, H He, and N Zhang. Research of palmprint recognition based on 2dpca (WuhanChina, 2009), pp. 831–838.
 18
W Zuo, K Wang, D Zhang, in Image Processing, 2005. ICIP 2005. IEEE International Conference On, 2. Bidirectional pca with assembled matrix distance metric, (2005), pp. 958–61. doi:http://dx.doi.org/10.1109/ICIP.2005.1530216
 19
X Pan, QQ Ruan, Palmprint recognition using Gabor featurebased (2d)2pca. Neurocomputing. 71(13–15), 3032–6 (2008).
 20
GM Lu, KQ Wang, D Zhang, in Machine Learning and Cybernetics, 2004. Proceedings of 2004 International Conference On, 6. Wavelet based independent component analysis for palmprint identification, (2004), pp. 3547–35506. doi:http://dx.doi.org/10.1109/ICMLC.2004.1380404
 21
X Wu, D Zhang, K Wang, Fisherpalms based palmprint recognition. Pattern Recognit. Lett. 24(15), 2829–38 (2003).
 22
M Ekinci, M Aykut, Gaborbased kernel pca for palmprint recognition. Electron. Lett. 43(20), 1077–9 (2007).
 23
R Raghavendra, A Rao, GK Hemantha, in International Conference on Advances in Computing, Control, Telecommunication Technologies. A novel three stage process for palmprint verification (Trivandrum, Kerala, 2009), pp. 88–92.
 24
X Wang, H Gong, H Zhang, B Li, Z Zhuang, in Pattern Recognition, 2006. ICPR 2006. 18th International Conference On, 3. Palmprint identification using boosting local binary pattern (IEEEHong Kong, 2006), pp. 503–6.
 25
A Kumar, H Shen, in 3rd Int Conference on Image and Graphics, ICIG. Palmprint identification using palm codes (Hong Kong), pp. 258–261. http://dx.doi.org/10.1007/9783642040702_42
 26
Z Sun, T Tan, Y Wang, SZ Li, in Computer Vision and Pattern Recognition, 2005. CVPR 2005. IEEE Computer Society Conference On. Ordinal palmprint representation for personal identification, (2005), pp. 279–84. http://dx.doi.org/10.1109/CVPR.2005.267
 27
A Kong, D Zhang, M Kamel, Palmprint identification using featurelevel fusion. Pattern Recognit. 39(3), 478–487 (2006).
 28
J Wei, W Jia, H Wang, DF Zhu, in Emerging Intelligent Computing Technology and Applications. Improved competitive code for palmprint recognition using simplified gabor filter, (2009), pp. 371–7. ISBN 9783642040696.
 29
Z Khan, A Mian, Y Hu, in IEEE International Conference on Computer Vision. Contour code: Robust and efficient multispectral palmprint encoding for human recognition (Barcelona, 2011), pp. 1935–1942.
 30
A Kumar, D Zhang, Personal authentication using multiple palmprint representation. Pattern Recognit. 38(10), 1695–1704 (2005).
 31
W Li, J You, D Zhang, Texturebased palmprint retrieval using a layered search scheme for personal identification. Multimedia, IEEE Trans. 7(5), 891–8 (2005).
 32
A Kong, D Zhang, M Kamel, A survey of palmprint recognition. Pattern Recognit. 42(7), 1408–18 (2009).
 33
D Zhang, W Zuo, F Yue, A comparative study of palmprint recognition algorithms. ACM Comput. Surv. 44(1), 2:1–2:37 (2012).
 34
J Kannala, E Rahtu, in Pattern Recognition (ICPR), 2012 21st International Conference On. Bsif: Binarized statistical image features (IEEE,Tsukuba, 2012), pp. 1363–6.
 35
R Raghavendra, C Busch, in Proceedings of the 2Nd ACM Workshop on Information Hiding and Multimedia Security, IH&MMSec ’14. Robust palmprint verification using sparse representation of binarized statistical features: a comprehensive study (ACM,New York, NY, USA, 2014), pp. 181–5.
 36
D Zhang, WK Kong, J You, M Wong, Online palmprint identification. Pattern Anal. Mach. Intell. IEEE Trans. 25(9), 1041–50 (2003).
 37
A Kumar, in Sixth Indian Conference on Computer Vision, Graphics Image Processing (ICVGIP). Incorporating cohort information for reliable palmprint authentication (Bhubaneswar, 2008), pp. 583–90.
 38
R Raghavendra, C Busch, Novel image fusion scheme based on dependency measure for robust multispectral palmprint recognition. Pattern Recognit. 44(6), 2505–221 (2014).
 39
H Lee, C Ekanadham, AY Ng, in Advances in Neural Information Processing Systems. Sparse deep belief net model for visual area V2, (2008), pp. 873–80.
 40
G Hinton, S Osindero, YW Teh, A fast learning algorithm for deep belief nets. Neural Comput. 18(7), 1527–54 (2006).
 41
P Vincent, H Larochelle, Y Bengio, PA Manzagol, in Proceedings of the 25th International Conference on Machine Learning. Extracting and composing robust features with denoising autoencoders (ACM,New York, NY, USA, 2008), pp. 1096–1103.
 42
BA Olshausen, DJ Field, Sparse coding with an overcomplete basis set: a strategy employed by V1? Vision Res. 37(23), 3311–25 (1997).
 43
JH van Hateren, A van der Schaaf, Independent component filters of natural images compared with simple cells in primary visual cortex. Proc. R. Soc. London. Series B: Biol. Sci. 265(1394), 359–66 (1998).
 44
AJ Bell, TJ Sejnowski, The independent components of natural scenes are edge filters. Vis. Res. 37(23), 3327–38 (1997).
 45
A Hyvarinen, J Hurri, PO Hoyer, Natural Image Statistics (Springer, Berlin, 2009).
 46
J Mairal, F Bach, J Ponce, G Sapiro, Online learning for matrix factorization and sparse coding. J. Mach. Learn. Res. 11, 19–60 (2010).
Acknowledgements
The authors extend their thanks to the reviewers for the comments and suggestions, which helped in improving the overall quality of this article. This work was funded by the EU 7th Framework Programme (FP7) under grant agreement no. 284862 for the large-scale integrated project FIDELITY.
Additional information
Competing interests
The authors declare that they have no competing interests.
Authors’ contributions
RR conceived, designed, and performed experiments. RR and CB contributed to the writing of the manuscript. All authors read and approved the final manuscript.
Rights and permissions
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
Cite this article
Raghavendra, R., Busch, C. Texture based features for robust palmprint recognition: a comparative study. EURASIP J. on Info. Security 2015, 5 (2015). https://doi.org/10.1186/s136350150022z
Keywords
 Biometrics
 Palmprint
 Comparative study
 Texture features