Evaluation of the Parameters Involved in the Iris Recognition System
M. Tech. Research Scholar, Department of Computer Science and Engineering, Dr. B R Ambedkar National Institute of Technology, Jalandhar, India
Minakshi Boruah
Received (Day Month Year)
Revised (Day Month Year)
Accepted (Day Month Year)
Abstract- Biometric recognition is an automatic identification method based on unique features or characteristics possessed by human beings, and iris recognition has proved itself one of the most reliable biometric methods available owing to the accuracy provided by the iris's unique epigenetic patterns. The main steps in any iris recognition system are image acquisition, iris segmentation, iris normalization, feature extraction and feature matching. The EER (Equal Error Rate) metric is considered the best metric for evaluating an iris recognition system. In this paper, different parameters, viz. the scaling factor used to speed up the CHT (Circle Hough Transform), the sigma for blurring with a Gaussian filter while detecting edges, the radius for weak-edge suppression in the edge detector used during segmentation and the gamma correction factor for gamma correction, together with the central wavelength for convolving with the Log-Gabor filter and the sigma upon central frequency during feature extraction, have been thoroughly tested and evaluated over the CASIA-IrisV1 database to obtain an improved parameter set. This paper demonstrates how the parameters must be set to obtain an optimized iris recognition system.
Index Terms- Biometric Recognition, Canny Edge Detector, CASIA-IrisV1, Circle Hough Transform, Equal Error Rate, Gamma Correction, Log-Gabor Filter, Non-Maxima Suppression.
I. INTRODUCTION
Biometric identifiers are measurable and distinctive features that are used to identify, denote and store information about individuals [1]. A biometric system operates by first capturing a sample of a biometric trait, such as a digital colour or black-and-white image of a face for facial recognition, a voice recording for voice recognition, or a fingerprint image for fingerprint recognition. The sample may then be refined so that the most important features can be extracted and noise in the sample is minimized, thus improving recognition. The sample is then converted into a biometric template using a mathematical function, and the template is normalized into an efficient representation of the sample that is subsequently used for comparisons.
At present, an increasing number of applications use biometrics for identification, authentication and recognition, for example banks for customers, corporate offices for employees, colleges for student attendance and so on.
Iris recognition is a method of biometric identification, recognition and verification that uses pattern-recognition techniques based on high-quality images of the irises of an individual's eyes [2]. Iris-based personal recognition compares favourably with other biometric technologies such as speech, face and fingerprint recognition [3] because of its high accuracy (it is considered the most accurate biometric technology available today), high recognition speed, non-intrusiveness and good stability [4–6].
In an iris recognition system, several distinct steps are involved in the analysis of the human iris: image acquisition, iris segmentation (finding the location of the iris in the acquired eye image), iris normalization (mapping the details captured from the iris ring into a fixed-size rectangular pattern to account for possible inconsistencies in iris size), feature extraction (encoding the details of the normalized pattern into a feature vector/template) and feature matching (calculating the degree of similarity between any two extracted iris features).
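To make the flow of data between these stages concrete, a minimal Python sketch is given below. The function names, signatures and array shapes are assumptions made only for illustration; they are not the interfaces of the proposed system or of any particular library.

```python
import numpy as np

# Structural sketch of an iris recognition pipeline; each stub stands for a stage
# whose realization is discussed in the following sections.

def segment(eye_image: np.ndarray):
    """Locate pupil and iris boundaries (e.g. Canny edges + Circle Hough Transform)."""
    ...

def normalize(eye_image, pupil_circle, iris_circle, radial_res=20, angular_res=480):
    """Unwrap the iris ring into a fixed-size rectangular sheet (rubber sheet model)."""
    ...

def extract_features(normalized_iris):
    """Convolve with a 1-D Log-Gabor filter and phase-quantize into a binary template and noise mask."""
    ...

def match(template_a, mask_a, template_b, mask_b) -> float:
    """Return a dissimilarity score (Hamming distance) between two templates."""
    ...

def recognize(probe_image, gallery):
    """Compare a probe eye image against a gallery of enrolled (template, mask) pairs."""
    pupil, iris = segment(probe_image)
    sheet = normalize(probe_image, pupil, iris)
    template, mask = extract_features(sheet)
    scores = {pid: match(template, mask, t, m) for pid, (t, m) in gallery.items()}
    return min(scores, key=scores.get), scores   # best match and all scores
```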
Although prototype systems for iris recognition existed earlier, a working model was first realized only in the 1990s by Professor John Daugman (University of Cambridge) [7]. Daugman's system (1994) was the first to be patented and has been licensed to many commercial developers of iris recognition systems. A large number of researchers have tested the Daugman system and all have reported a zero failure rate; it is claimed that the system misidentifies an individual with a probability of only 10^-8.
Near-infrared illumination and cameras were used and their benefit was confirmed by Libor Masek [8]. Although Masek carried out a parametric analysis, his focus deviated from the final goal of a better EER, concentrating instead on parameters such as the number of degrees of freedom (DOF).
Some authors analyzed the Gabor filter [9], some a particular angular span and radius length of the iris [10], and some the frequency, scale and orientation of the Gabor filter for iris feature extraction [11], but most did not do so with the aim of reducing EER, focusing instead, indirectly, on FAR (False Acceptance Rate), FRR (False Rejection Rate), the decidability index (DI) or GAR (Genuine Acceptance Rate) [9] [11], and some used other databases [10]. Moreover, in the cases where EER was indeed considered, the EER results were not satisfactory [12]. In short, it is rare to find research in which the parameters are explicitly tested for reducing EER.
The EER (Equal Error Rate) metric is considered the best metric for evaluating an iris recognition system. The EER is the point where the false acceptance and false rejection rates meet and are jointly minimal, i.e. it is a compromise point between FAR and FRR; hence a single point summarizes an entire curve (e.g. the Detection Error Trade-off (DET) curve). The values of FRR and FAR both depend on the threshold. A high FAR increases the risk of giving access to unauthorized personnel, while a high FRR causes a genuine user problems in gaining access, as the probability of rejection is increased. Since accuracy (defined in Section III) is defined for every threshold, accuracy and the EER are never simultaneously at their best for the same threshold value.
According to market predictions, the global market for iris biometrics will reach US$ 1.8 billion by 2020. This comes as no surprise, given its use for nearly every purpose related to a person's identity and the ongoing developments in the cognitive Internet of Things (CIoT). Over the last decade the UK, the US and the United Arab Emirates (UAE) have all used, and are still using, iris recognition to identify potential fraudsters and other threats, which clearly shows its increasing role in security applications.
The purpose of this work is to perform a parametric analysis so that the system behaves according to the particular need and is properly calibrated, producing a system optimized for metrics such as EER (in this proposed work the EER is reduced to 0.0076). Four parameters used during segmentation were tested, along with two parameters belonging to feature extraction. Some parameters, such as the scaling factor used to speed up the Circle Hough Transform, can be set so as to reduce EER and thus improve the results at first sight, but at the cost of increased running time and hence reduced overall efficiency. The same holds for the 'number of sides' defining the circle boundary, used during normalization to draw circles by obtaining the circle coordinates.
II. Related Work
Wildes et al. [14] mention that the first use of iris recognition for individual identification was in the late 1800s, when the colour pattern of the irises of inmates in a Parisian prison was visually inspected to determine their identity.
Further, in 1936, the idea of using iris patterns as a method to recognize an individual was proposed by Frank Burch. However, it was not until the end of the Second World War (2 September 1945) that ophthalmologists started writing more seriously about the possibility of using the iris patterns of the human eye as a method of identifying individuals. Libor Masek [8] used Canny edge detection together with Hough transforms to segment the eye images, an approach already shown to be beneficial by Wildes [14], but he performed only a subjective manual evaluation of the segmentation step, not an evaluation of the results of the entire process. He used Daugman's rubber sheet model [7] to translate from Cartesian to polar coordinates (the normalization process). The separated and normalized iris was then encoded by convolving it with 1-D Log-Gabor wavelets. Finally, he quantized the filter output by phase quadrature into a binary iris code, the resulting codes differing from one another by roughly 50%. For the matching stage, Masek used this binary template, the iris code, consisting of two bits per pixel, and the Hamming distance was used to calculate dissimilarity scores [15].
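As an illustration of this matching step, the following minimal sketch computes the mask-aware Hamming distance between two binary iris codes in the standard way; the template size and the random test data are assumptions for the example only.

```python
import numpy as np

def hamming_distance(code_a, code_b, mask_a, mask_b):
    """Fraction of disagreeing bits, counted only where both noise masks mark valid iris bits."""
    valid = mask_a & mask_b                    # bits usable in both templates
    disagreements = (code_a ^ code_b) & valid  # XOR flags the differing bits
    n_valid = valid.sum()
    return disagreements.sum() / n_valid if n_valid else 1.0

# Example with random 20 x 480 bit templates (9600 bits each, the template size used later).
rng = np.random.default_rng(0)
a, b = rng.integers(0, 2, (2, 20, 480), dtype=np.uint8).astype(bool)
m = np.ones_like(a)                            # all bits valid in this toy example
print(hamming_distance(a, b, m, m))            # close to 0.5 for unrelated codes
```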
In Masek's work the dependency of EER on the various parameters was not studied at all, even though the impact of the Log-Gabor filter parameters on decidability and on the number of DOF was studied.
In the work by Mayank Vatsa et al. [9], neither the parameters that control textural feature extraction using the 1-D log-polar Gabor wavelet nor the parameters that control topological feature extraction using the Euler number were analyzed. The only accuracy analysis concerned the parameters of the ellipses used in the proposed two-stage elliptical iris segmentation model, and it was performed on three databases other than the CASIA-IrisV1 [13] database used in the present work.
Ajay Kumar et al. [12] combined the phase-encoding methods employed to capture texture information in order to improve on each method used singly. The approaches tested were Log-Gabor, Haar wavelet, DCT and FFT. During testing, the centre frequency and bandwidth of the Log-Gabor filter were the only parameters analyzed, and the results were reported in terms of GAR and DI. The experiments on the CASIA-IrisV1 database gave an EER of 0.94 with a single training sample and 0.36 with two.
In the work by Thiyaneswaran et al. [16], the coefficients of the Gaussian filter and of the Gaussian envelope were varied to reduce the computing time on the UBIRIS Version 2.0 iris database images. The proposed Gabor wavelet reduced the average feature extraction time to 141 nanoseconds; no effort was made to reduce EER.
In the research by Al-Waisy et al. [17] the focus was on healthcare systems, where a high security level is a must for protecting extremely sensitive patient records, the major goal being secure access to the needed records at any time with a high level of confidentiality maintained for patients. The pupil's radius and centre coordinates were employed to discard unnecessary edge points within the iris region in order to reduce the search time of the Hough transform. However, the accuracy achieved was just 99.07% on the CASIA Version 1.0 database (in the present work it is 99.34%). Moreover, the role of the various other parameters involved, such as the sigma upon central frequency and the central wavelength of the Log-Gabor filter, was not studied, and neither was the impact on EER; the case for preferring EER over accuracy was already made in the previous section.
Rupesh Mude et al. [11] used a 2-D Gabor filter for feature extraction and performed a detailed analysis of the impact of the filter frequency ω, the filter scale defining the Gabor in the spatial domain and the filter orientation on the decidability index. The work discusses how these parameters may influence filter performance but gives no insight into the final result, even in terms of accuracy, and no discussion of EER is made.
In summary, few researchers have emphasized improving EER alongside time and space, and few have emphasized the important parameters, such as those of the filters used. In this research these parameters are: the sigma for blurring with the Gaussian filter while detecting edges, the scaling factor to speed up the Hough transform, the gamma correction factor for gamma correction and the radius for weak-edge suppression in the edge detector during segmentation; and the sigma upon central frequency and the central wavelength for convolving with the Log-Gabor filter during feature extraction. Hence it is rare to find research in which the parameters are explicitly tested for reducing EER (in this proposed work the EER is reduced to 0.0076).
III. Analysis of the Parameters Used in the Various Stages of Iris Recognition
Graphs were plotted as part of the methodology for evaluating the parameters, and further analysis was carried out as explained below. The results obtained when these optimum parameters are used with a modified CHT algorithm show real improvements. The metrics FAR (False Acceptance Rate), FRR (False Rejection Rate), EER (Equal Error Rate), accuracy, TPR (True Positive (Acceptance) Rate) and TNR (True Negative (Rejection) Rate) have been used to measure the efficiency of the proposed system and are explained in this section.
FAR: FAR is defined as the fraction of negative (impostor) identification attempts that are falsely accepted.
FRR: FRR is defined as the fraction of positive (genuine) identification attempts that are falsely rejected.
TPR = 1-FRR
TNR = 1-FAR
TP (total number of correct acceptances) = TPR × actual number of positive identification attempts
TN (total number of correct rejections) = TNR × actual number of negative identification attempts
Accuracy: Accuracy is defined as the total number of correct rejections and acceptances over the total number of attempts made to enter the system.
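A minimal sketch of these relations, assuming FAR and FRR have already been measured at some threshold and that the numbers of positive and negative attempts are known (all values below are illustrative):

```python
def summarize(far, frr, n_positive, n_negative):
    """Derive TPR, TNR, TP, TN and accuracy from FAR/FRR and the attempt counts."""
    tpr, tnr = 1.0 - frr, 1.0 - far
    tp = tpr * n_positive                      # correct acceptances
    tn = tnr * n_negative                      # correct rejections
    accuracy = (tp + tn) / (n_positive + n_negative)
    return tpr, tnr, tp, tn, accuracy

print(summarize(far=0.01, frr=0.02, n_positive=500, n_negative=5000))
```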
Fig. 1. False acceptance rate (FAR) and false rejection rate (FRR) as functions of the threshold t [18]
EER: Plotting FAR and FRR on a graph as functions of the threshold, one finds a point where the two curves cross; this crossing point is the EER, a compromise between FAR and FRR. The lower the EER, the better the system. Fig. 1 shows the plot of FAR and FRR as functions of the threshold (say t).
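The sketch below shows one common way to estimate the EER: sweep the decision threshold over genuine (same-iris) and impostor (different-iris) dissimilarity scores and take the point where FAR and FRR are closest. The Gaussian score distributions are assumed purely for illustration.

```python
import numpy as np

def compute_eer(genuine_scores, impostor_scores, n_steps=1000):
    """Scores are dissimilarities: a comparison is accepted when its score <= threshold."""
    thresholds = np.linspace(0.0, 1.0, n_steps)
    frr = np.array([(genuine_scores > t).mean() for t in thresholds])    # genuine rejected
    far = np.array([(impostor_scores <= t).mean() for t in thresholds])  # impostor accepted
    i = np.argmin(np.abs(far - frr))           # threshold where the two curves cross
    return (far[i] + frr[i]) / 2.0, thresholds[i]

rng = np.random.default_rng(1)
genuine = rng.normal(0.30, 0.05, 1000)         # illustrative genuine score distribution
impostor = rng.normal(0.47, 0.03, 1000)        # illustrative impostor score distribution
print(compute_eer(genuine, impostor))
```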
Evaluation of Sigma for Gaussian Filter of Canny Edge Detector
The size of the Gaussian filter, which is the smoothing filter used in the first stage of the Canny edge detection [19] algorithm, directly affects its final results. Smaller filters cause less blurring and allow detection of small and fine lines.
Lower frequencies correspond to regions with few intensity changes, and a low-pass filter passes these low frequencies. Edges (high frequencies) become smoother as sigma is increased, because more detail is sacrificed; conversely, decreasing sigma preserves more pixel-level detail. Just as increasing the standard deviation of a normal distribution spreads it out and flattens its peak, increasing the standard deviation of the Gaussian filter makes the image blurrier, since sigma measures the spread about the peak. The Gaussian filter kernel of size (2k+1) × (2k+1) used in Canny edge detection (filtering in the spatial domain) is given by:
H(i, j) = (1 / (2πσ²)) · exp( −((i − (k+1))² + (j − (k+1))²) / (2σ²) ),   1 ≤ i, j ≤ 2k+1    (1) [19]
Fig. 2. 2-D Gaussian distribution with mean (0, 0) and σ=1 [20]
A larger filter radius (more blurring) is more useful for detecting larger, smoother edges. Filter sizes larger than about 3σ are possible, but there is little benefit in using them: Gaussians with larger standard deviations require larger convolution kernels to be represented accurately, and the localization error in detecting the edge increases.
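A small sketch of equation (1), building a (2k+1) × (2k+1) kernel with k chosen as roughly 3σ (the size beyond which, as noted above, there is little benefit); the helper is illustrative, not the exact kernel construction used in the experiments.

```python
import numpy as np

def gaussian_kernel(sigma: float) -> np.ndarray:
    """Discrete (2k+1) x (2k+1) Gaussian kernel with k ≈ 3*sigma, per equation (1)."""
    k = max(1, int(np.ceil(3 * sigma)))
    i, j = np.mgrid[1:2 * k + 2, 1:2 * k + 2]           # indices 1 .. 2k+1
    h = np.exp(-((i - (k + 1)) ** 2 + (j - (k + 1)) ** 2) / (2 * sigma ** 2))
    h /= 2 * np.pi * sigma ** 2
    return h / h.sum()                                  # normalize so overall brightness is preserved

print(gaussian_kernel(2.5).shape)   # sigma = 2.5 (the optimum found here) gives a 17 x 17 kernel
```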
Fig. 3. Sigma for Gaussian filter of canny edge detector Vs EER
The degree of smoothing should be chosen according to whether that level of blurring removes useless data while retaining the required detail, because if it is increased too much, the edge to be detected becomes blurred and the EER consequently increases. Thus smoothing is initially beneficial and the EER is observed to decrease, but beyond a particular point (the optimum, here 2.5) it starts to increase, as depicted in the plot of Fig. 3.
Evaluation of Radius for Non-Maxima Suppression
After blurring, the edge extracted from the gradient values (the rate of intensity change at each point in the image; see Fig. 4 below) is quite blurred and hence wide.
Say the detected edge is 5 px wide; if one wants the location of the edge to be marked by a 1-px-wide line, an edge-suppression technique can be used that finds the maximum across the blurred edge gradient and marks the middle pixel as the actual edge (edge thinning).
The radius parameter used in Non-maxima suppression is the distance in pixel units to be looked at on each side of each pixel when determining whether it is a local maximum or not.
Thus non-maxima suppression ignores (sets to 0) all gradient values other than the local maxima, which denote the locations of the sharpest change in intensity value (an accurate response that must be marked as edges). Instead of explicit differentiation perpendicular to each edge, a different kind of approximation is often used.
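The following sketch shows one way such a radius-based suppression can be implemented: for each pixel, the gradient magnitude is compared with values interpolated `radius` pixels away on either side along the gradient direction, and non-maximal pixels are zeroed. The interpolation details are assumptions for illustration, not the exact implementation evaluated in this work.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def nonmax_suppress(grad_mag, grad_orient_rad, radius=1.5):
    """Zero every pixel that is not a local maximum of the gradient magnitude along
    the gradient direction, sampled `radius` pixels away on each side."""
    rows, cols = np.mgrid[0:grad_mag.shape[0], 0:grad_mag.shape[1]].astype(float)
    dr = radius * np.sin(grad_orient_rad)      # row offset along the gradient direction
    dc = radius * np.cos(grad_orient_rad)      # column offset along the gradient direction
    fwd = map_coordinates(grad_mag, [rows + dr, cols + dc], order=1, mode='nearest')
    bwd = map_coordinates(grad_mag, [rows - dr, cols - dc], order=1, mode='nearest')
    keep = (grad_mag >= fwd) & (grad_mag >= bwd)
    return np.where(keep, grad_mag, 0.0)
```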
Fig. 4. Image of feature normal/gradient orientation angles in degrees (0–180, positive anti-clockwise), used to determine the gradient direction
Fig. 5. Non-maxima suppression radius for canny edge detector Vs EER
A radius of 1.5 is deemed appropriate for edge suppression: up to that point non-maxima suppression is helpful, but beyond it the EER starts to increase (Fig. 5).
Evaluation of Gamma Correction Factor for Gamma Correction
Gamma correction [21], or often simply gamma, is, in the simplest cases, defined by the following power-law expression:
Vout = A · Vin^γ    (2)
In electronic equipment such as a television, the light intensity I is related to the source voltage Vs as:
I ∝ Vs^γ    (3)
The inverse of the function above is:
I ∝ Vs^(1/γ)    (4)
This relation, the inverse transfer function (gamma correction), is used to compensate for the effect of the gamma function on the light intensity given in equation (3), so that the end-to-end response is linear. Gamma correction adjusts the contrast so that the output picture has the intended luminance, which helps in segmenting and classifying efficiently. Powers smaller than 1 make dark regions lighter (Fig. 6), while powers larger than 1 make the shadows darker. An appropriate gamma correction factor can therefore greatly ease the gamma correction process, and this is checked via a plot of the gamma correction factor versus EER (Fig. 7).
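A minimal sketch of equation (2) applied to an 8-bit grey-level image, with intensities normalized to [0, 1] before raising them to the power γ; the gain A and the test image are placeholders.

```python
import numpy as np

def gamma_correct(image_u8: np.ndarray, gamma: float = 1.9, gain: float = 1.0) -> np.ndarray:
    """Apply Vout = A * Vin**gamma on intensities normalized to [0, 1]."""
    v_in = image_u8.astype(np.float64) / 255.0
    v_out = gain * np.power(v_in, gamma)
    return np.clip(v_out * 255.0, 0, 255).astype(np.uint8)

# gamma > 1 darkens the shadows, gamma < 1 lightens dark regions (cf. Fig. 6).
ramp = np.arange(256, dtype=np.uint8).reshape(16, 16)
print(gamma_correct(ramp, gamma=1.9)[0, :4], gamma_correct(ramp, gamma=0.5)[0, :4])
```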
Fig. 6. The effect of gamma correction on an image (original image) when the powers are varied [21]
Fig. 7. Gamma for gamma correction Vs EER
A gamma correction factor of 1.9 is taken as the optimum, since the EER is lowest there.
Evaluation of Scale for Speeding the Hough Transform
Fig. 8. Scaling for speeding Hough transform Vs EER
Scaling has been used in this work to speed up the Circle Hough Transform: both the upper and lower radii, along with their difference, are scaled down. Scaling has also been used in the Canny edge detection stage to scale down the whole image before detection. Increasing the scaling factor will no doubt give better results (i.e. reduce EER, as depicted in Fig. 8), but will also increase the space and time consumed.
The value 0.25 is taken as it is the middle point of the graph, so that neither too much nor too little down-scaling occurs.
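A hedged sketch of this idea using OpenCV's circle Hough detector: the image and the radius search range are scaled down by the same factor before detection, and the best circle is scaled back up afterwards. The parameter values are placeholders, and this is not the exact modified CHT used in the proposed system.

```python
import cv2

def find_circle_scaled(gray, r_min, r_max, scale=0.25):
    """Run the Circle Hough Transform on a down-scaled image to save time,
    then rescale the detected circle back to the original resolution."""
    small = cv2.resize(gray, None, fx=scale, fy=scale, interpolation=cv2.INTER_AREA)
    circles = cv2.HoughCircles(small, cv2.HOUGH_GRADIENT, dp=1,
                               minDist=small.shape[0] // 4,
                               param1=100, param2=30,              # edge / accumulator thresholds (placeholders)
                               minRadius=int(r_min * scale),
                               maxRadius=int(r_max * scale))
    if circles is None:
        return None
    x, y, r = circles[0, 0]
    return x / scale, y / scale, r / scale     # centre and radius in original coordinates
```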
Evaluation of Centre Wavelength of Log-Gabor Filter
The frequency response of a 1-D Log-Gabor filter is given as [22]
G(f) = exp( −(log(f/f0))² / (2 (log(σ/f0))²) )    (5)
where f0 and σ are the parameters of the filter: f0 is the centre frequency and σ is the standard deviation of the Gaussian describing the log-Gabor filter's transfer function in the frequency domain, which affects the bandwidth of the filter.
The centre wavelength of the Log-Gabor filter [22] (the inverse of the centre frequency) is a fixed constant that determines the central frequency around which the filter looks for texture information. Increasing the centre wavelength does decrease the EER, but at the reduced frequency the accuracy is observed to suffer. Moreover, a minute decrease in EER should not drive the choice towards a solution in which only a few frequency samples are used (less frequency resolution) at the cost of a higher spatial/temporal resolution (more space). Hence a centre wavelength of 11 (Fig. 9) is chosen as the optimal value for an efficient, properly working system with respect to all metrics (TP/TN).
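The sketch below builds the 1-D Log-Gabor frequency response of equation (5) from the centre wavelength and the σ/f0 ratio and applies it, via the FFT, to one angular row of a normalized iris sheet. The shapes, the random test row and the handling of the DC term are assumptions for illustration.

```python
import numpy as np

def log_gabor_response(n, centre_wavelength=11.0, sigma_on_f=0.45):
    """Equation (5): G(f) = exp(-(log(f/f0))**2 / (2*(log(sigma_on_f))**2)), f0 = 1/centre_wavelength."""
    f = np.fft.fftfreq(n)                      # normalized frequency axis; f[0] = 0 is the DC term
    f0 = 1.0 / centre_wavelength
    G = np.zeros(n)
    pos = f > 0                                # keep the DC response at 0 (no DC component)
    G[pos] = np.exp(-(np.log(f[pos] / f0) ** 2) / (2 * np.log(sigma_on_f) ** 2))
    return G

def filter_row(row, **kwargs):
    """Apply the Log-Gabor filter to one row of the normalized iris in the frequency domain."""
    G = log_gabor_response(row.size, **kwargs)
    return np.fft.ifft(np.fft.fft(row) * G)    # complex response; its phase would later be quantized

row = np.random.default_rng(2).random(480)     # stand-in for one row of a 20 x 480 normalized iris
response = filter_row(row)
```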
Fig. 9. Centre wavelength of Log-Gabor filter Vs EER
Evaluation of Sigma Upon the Filter Center Frequency for Log-Gabor Filter
The Log-Gabor filter can represent frequency information locally [22]. A Gabor filter is a good tool for simultaneously localizing spatial/temporal information (the pixel values themselves) and frequency information (the rate at which the pixel values change in the spatial domain). At some bandwidths, however, the Gabor filter has a non-zero DC component and thus yields features that over-represent the lower frequencies; the Log-Gabor filter does not exhibit this problem.
In general, the shape of the filter is determined by the ratio σ/f, where f is the centre frequency (sometimes denoted f0). The sigma appears only in the Gaussian part and plays the same role here (controlling the bandwidth of the filter) as described for the sigma of the Gaussian filter in the Canny edge detector, while the filter centre frequency is a fixed constant that divides the sigma in the Log-Gabor equation; f defines the frequency being looked for in the texture. By varying σ we change the support of the basis, i.e. the size of the image region being analyzed, namely the width around the centre frequency used for feature extraction.
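Reusing the hypothetical log_gabor_response helper sketched in the previous subsection, the effect of σ/f0 on the bandwidth can be inspected directly: ratios further from 1 widen the pass band around f0, while ratios closer to 1 narrow it.

```python
import numpy as np

# Assumes the log_gabor_response() sketch from the previous subsection is in scope.
for ratio in (0.30, 0.45, 0.75):
    G = log_gabor_response(480, centre_wavelength=11.0, sigma_on_f=ratio)
    half_power_bins = int((G > 0.5 * G.max()).sum())   # rough half-power bandwidth in frequency bins
    print(f"sigma/f0 = {ratio:.2f} -> half-power width ≈ {half_power_bins} bins")
```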
The plot in Fig. 10, used for analysis and testing of this parameter, further clarifies its effect: the EER decreases as the width of texture extraction increases, and beyond a particular point, when useless frequencies start to be extracted, it increases again.
The value 0.45 is taken for σ/f0, as the EER is optimal there.
Fig. 10. σ/f for Log-Gabor filters Vs EER.
IV. Results
The vital effect of the optimized parameters used in the different stages of an iris recognition system on the EER was observed in this paper. These parameters are the sigma for blurring with the Gaussian filter while detecting edges, the scaling factor to speed up the Hough transform, the gamma correction factor for gamma correction and the radius for weak-edge suppression in the edge detector during segmentation; and the sigma upon central frequency and the central wavelength for convolving with the Log-Gabor filter during feature extraction (Table 1). Furthermore, since care was taken during the optimization not to adversely affect space and time, an efficient iris recognition system resulted. The final optimized EER of 0.0076 is evident in the DET graph of Fig. 11. It is therefore concluded that a thorough analysis of the various parameters involved in an iris recognition system helps to build a system which, apart from having a good EER, also exhibits the additional required properties (Table 2) and is thus optimized in its performance.
TABLE I
THE FINAL OPTIMIZED PARAMETER VALUES.
Centre wavelength | Sigma for blurring with Gaussian filter | σ/f for Log-Gabor filter | Scaling for speeding Hough transform | Gamma for gamma correction | Non-maxima suppression radius |
11 | 2.5 | 0.45 | 0.25 | 1.9 | 1.5 |
Fig. 11. DET graph obtained by plotting FAR vs FRR.
TABLE II
The final results along with EER of 0.007614.
EER | 0.007614 |
SPACE | 20 × 480 = 9,600 bits per eye image |
TIME | 136 minutes on a PC (1.80 GHz CPU, 4 GB RAM) |
ACCURACY | 99.34% |
V. Discussion
In the future, research should also focus on the parameters discussed here, as they play a vital role in obtaining the final result of better EER, time, space and accuracy. They will help in designing new mobile applications that need an optimized calibration.
VI. Conclusion
Although a vast amount of research has been done in the field of iris recognition, there is still a dearth of work aimed exclusively at reducing the error rate by improving the various parameters, and this paper presented a method to close that gap. In this research these parameters are: the sigma for blurring with the Gaussian filter while detecting edges, the scaling factor to speed up the Hough transform, the gamma correction factor for gamma correction and the radius for weak-edge suppression in the edge detector during segmentation; and the sigma upon central frequency and the central wavelength for convolving with the Log-Gabor filter during feature extraction. This proposed work reduced the EER to a minimal value of 0.0076.
Acknowledgment
I thank all the faculty members of the Department of Computer Science and Engineering, Dr. B R Ambedkar National Institute of Technology, Jalandhar, India, for their valuable suggestions. I would also like to thank my family for their continuous support and blessings during the whole period of this work.
Portions of the research in this paper use the CASIA-IrisV1 collected by the Chinese Academy of Sciences' Institute of Automation (CASIA).
References
1. A. Jain, L. Hong and S. Pankanti, Biometric Identification. Communications of the ACM 2000, 43(2), p. 91-98. DOI 10.1145/328236.328110.
2. Iris recognition. Available online: Wikipedia https://en.wikipedia.org/wiki/Iris_recognition (9 Aug 2018).
3. S. Sanderson and J. Erbetta. Authentication for secure environments based on iris scanning technology. IEEE Colloquium on Visual Biometrics, 2000.
4. K. W. Bowyer, K. Hollingsworth and P. J. Flynn. Image understanding for iris biometrics: A survey. Computer Vision and Image Understanding, May 2008, vol. 110, pp. 281-307.
5. J. Daugman. How iris recognition works. IEEE Trans. Circuits Syst. Video Technol., January 2004, vol. 14, pp. 21-30.
6. R. P. Wildes. Iris recognition: an emerging biometric technology. Proceedings of the IEEE, 1997, pp.1348-1363.
7. J. Daugman. Biometric personal identification system based on iris analysis. United States Patent, Patent Number: 5,291,560, 1994.
8. L. Masek. Recognition of Human Iris Patterns. Bachelor of Engineering degree, School of Computer Science and Software Engineering, The University of Western Australia, 35 Stirling Hwy, Crawley WA 6009, Australia, 2003.
9. M. Vatsa, R. Singh, A. Noore. Improving iris recognition performance using segmentation quality enhancement match score fusion and indexing. IEEE Trans. Syst. Man Cybern. B Cybern, 2008, vol. 38, no. 4, pp. 1021-1035.
10. T. Marciniak, A. Dąbrowski, A. Chmielewska, A. Krzykowska. Selection of parameters in iris recognition system. Multimedia Tools And Applications, January 2014, vol. 68, no. 1, pp. 193-208.
11. Rupesh Mude & Meenakshi R Patel. Gabor Filter for Accurate IRIS Segmentation Analysis. International Journal of Innovations in Engineering and Technology, October 2015, Volume 6 Issue 1.
12. Ajay Kumar, Arun Passi. Comparison and combination of iris matchers for reliable personal authentication. Pattern Recognition, March, 2010, vol. 43, no. 3, p.1016-1026, [DOI 10.1016/j.patcog.2009.08.016]
13. T. Tan, Note on CASIA-IrisV1, Chinese Academy of Sciences' Institute of Automation, 03 2011. Available online: http://biometrics.idealtest.org/. (Accessed April 30, 2018).
14. R. Wildes, J. Asmuth, G. Green, S. Hsu, R. Kolczynski, J. Matey and S. McBride. A system for automated iris recognition. Proceedings of the Second IEEE Workshop on Applications of Computer Vision, Sarasota, FL, 1994.
15. R. Hamming. Error detecting and error correcting codes. The Bell System Technical Journal, 1950, vol. 29, no. 2, pp. 147-160.
16. Thiyaneswaran B., Padma S. Analysis of Gabor Filter Parameter for Iris Feature Extraction. IJACT, 2014 Volume 3, Number 5. pp: 45-48.
17. A. S. Al-Waisy, R. Qahwaji, S. Ipson and S. Al- Fahdawi. A Fast and Accurate Iris Localization Technique for Healthcare Security System. IEEE International Conference on Computer and Information Technology; Ubiquitous Computing and Communications; Dependable, Autonomic and Secure Computing; Pervasive Intelligence and Computing, 2015, pp. 1028–1034.
18. Identité numérique & authentification forte. Available online: https://fr.slideshare.net/smaret/identit-numrique-et-authentification-forte-1070073 (Accessed on 20, June 2018).
19. Canny edge detector. Available online: https://en.wikipedia.org/wiki/Canny_edge_detector. (Accessed on 30 April, 2018).
20. Gaussian Smoothing. Available online: https://homepages.inf.ed.ac.uk/rbf/HIPR2/gsmooth.htm (Accessed on 20 June, 2018).
21. Gamma correction. Available online: https://en.wikipedia.org/wiki/Gamma_correction. (Accessed on 30 April, 2018).
22. Log Gabor filter. Available online: https://en.wikipedia.org/wiki/Log_Gabor_filter. (Accessed on 30, April, 2018).