Trends In Secure Personal Authentication Information Technology Essay

Published: November 30, 2015 Words: 3265

Biometric authentication technologies such as fingerprint scanning, voice authentication, face recognition, hand geometry authentication, signature verification and iris scanning are now playing a major role, especially in data security applications. Biometrics can eliminate the problem of forgotten passwords and strengthen security relative to passwords. Currently, some biometric cryptographic algorithms have been developed based on fingerprints, iris, face, voice, signature and palm prints. A comparison of some of the biometric identifiers based on seven factors is provided in Fig. 1.

The iris is the annular region of the eye bounded by the pupil and the sclera (white of the eye) on either side [6]. The visual texture of the iris is formed during fetal development and stabilizes during the first two years of life. The complex iris texture carries very distinctive information useful for personal recognition. The accuracy and speed of currently deployed iris-based recognition systems are promising and point to the feasibility of large-scale identification systems based on iris information. Each iris is distinctive and, like fingerprints, even the irises of identical twins are different. It is extremely difficult to surgically tamper with the texture of the iris. Further, it is rather easy to detect artificial irises (e.g., designer contact lenses). Although the early iris-based recognition systems required considerable user participation and were expensive, the newer systems have become more user-friendly and cost effective.

This recognition method uses the iris of the eye, which is the colored area that surrounds the pupil. Iris patterns are unique and are obtained through a video-based image acquisition system. Each iris has a complex structural pattern, a combination of specific characteristics known as the corona, crypts, filaments, freckles, pits, furrows, striations and rings.

Fig. 3. Image of the iris. The numbers in this figure correspond to: 1) eyelids, 2) eyelashes, 3) specular reflections, 4) pupil, 5) sclera, and 6) shadow caused by an eyelid.

Artificial duplication of the iris is virtually impossible because of its unique properties [9]. The iris is closely connected to the human brain and is said to be one of the first parts of the body to decay after death. It should therefore be very difficult to create an artificial iris to fraudulently bypass biometric systems, provided the detection of iris liveness is working properly.

3. SECURING THE IRIS

A biometric system is essentially a pattern recognition system that operates by acquiring biometric data from an individual, extracting a feature set from the acquired data, and comparing this feature set against the template set in the database. Depending on the application context, a biometric system may operate either in verification mode or identification mode.

1. Verification:

In the verification mode, the system validates a person's identity by comparing the captured biometric data with his or her own biometric template(s) stored in the system database. E.g., does this biometric data belong to the claimed person?

2. Identification:

In the identification mode, the system recognizes an individual by searching the templates of all the users in the database for a match. Therefore, the system conducts a one-to-many comparison to establish an individual's identity without the subject having to claim an identity. E.g., whose biometric data is this?
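The two modes can be sketched as follows. This is only an illustrative toy: the `match` score, the threshold, and the in-memory database are made-up stand-ins, not part of any real biometric system.

```python
# A minimal sketch of 1:1 verification versus 1:N identification.
# All names and values here are illustrative assumptions.

def match(template, sample):
    # Toy similarity: fraction of equal bits in two equal-length bit strings.
    return sum(t == s for t, s in zip(template, sample)) / len(template)

DB = {"alice": "101101", "bob": "010011"}  # user -> enrolled iris code
THRESHOLD = 0.8

def verify(claimed_id, sample):
    """1:1 comparison: does this sample belong to the claimed user?"""
    return match(DB[claimed_id], sample) >= THRESHOLD

def identify(sample):
    """1:N search: whose sample is this, if anyone's?"""
    best_id = max(DB, key=lambda uid: match(DB[uid], sample))
    return best_id if match(DB[best_id], sample) >= THRESHOLD else None
```

The key difference is that identification must scan the whole database, so its cost grows with enrolment, while verification stays constant-time per attempt.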

4. EVALUATION

When biometric authentication is used, the degree of security it provides is a central concern. In this paper, we have discussed various types of biometric authentication techniques. In this section, we evaluate the different techniques and determine their degree of security. There are several parameters with which we can measure the performance of any biometric authentication technique. These factors are described below.

Factors of Evaluation

1. False Accept Rate (FAR) or False Match Rate (FMR): The probability that the system incorrectly declares a successful match between the input pattern and a non-matching pattern in the database. It measures the percentage of invalid matches. These systems are critical, since they are commonly used to forbid certain actions by disallowed people [1].

2. False Reject Rate (FRR) or False Non-Match Rate (FNMR): The probability that the system incorrectly declares failure of a match between the input pattern and the matching template in the database. It measures the percentage of valid inputs being rejected.

3. Relative Operating Characteristic (ROC): In general, the matching algorithm reaches a decision using some parameters (e.g. a threshold). In biometric systems the FAR and FRR can typically be traded off against each other by changing those parameters. The ROC plot is obtained by graphing the values of FAR against FRR, changing the threshold implicitly. A common variation is the Detection Error Tradeoff (DET), which is obtained using normal deviate scales on both axes. This more linear graph illuminates the differences at higher performances (rarer errors).
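The FAR/FRR tradeoff can be made concrete with a small sketch. The score lists below are invented for illustration; sweeping the threshold over them traces out one point of the ROC curve per threshold.

```python
# A hedged sketch of how FAR and FRR trade off against a decision threshold.
# genuine_scores: match scores from same-person comparisons;
# impostor_scores: scores from different-person comparisons (made-up data).

def far_frr(genuine_scores, impostor_scores, threshold):
    far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    return far, frr

genuine = [0.91, 0.85, 0.78, 0.95, 0.88]
impostor = [0.30, 0.45, 0.62, 0.20, 0.81]

# Raising the threshold lowers FAR but raises FRR, and vice versa:
for t in (0.5, 0.7, 0.9):
    print(t, far_frr(genuine, impostor, t))
```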

5. ENCRYPTION AND DECRYPTION

Encryption is the process of transforming information (referred to as plaintext) using an algorithm (called cipher) to make it unreadable to anyone except those possessing special knowledge, usually referred to as a key. The result of the process is encrypted information (in cryptography, referred to as ciphertext).

Decryption is the process of decoding data that has been encrypted into a secret format. Decryption requires a secret key or password.

There is, of course, a wide range of cryptographic algorithms in use. The following are amongst the best known:

1. DES

The Data Encryption Standard is a symmetric ('private key') block cipher developed by IBM. It operates on 64-bit blocks of data using a 56-bit key. The key is always presented as a 64-bit block, every 8th bit of which is ignored. However, it is usual to set each 8th bit so that each group of 8 bits has an odd number of bits set to 1.
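The parity convention above can be sketched directly: each key byte carries 7 key bits, and its last bit is set so the byte contains an odd number of 1 bits.

```python
# A small sketch of the DES key-parity convention: each byte's least
# significant bit is adjusted so the byte has an odd number of 1 bits.
# The other 56 bits carry the actual key material.

def set_odd_parity(key8):
    """Return an 8-byte DES key with each byte adjusted to odd parity."""
    out = bytearray()
    for b in key8:
        seven = b & 0xFE                            # keep the 7 key bits
        ones = bin(seven).count("1")
        out.append(seven | (0 if ones % 2 else 1))  # make the 1-count odd
    return bytes(out)
```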

The algorithm is best suited to implementation in hardware, probably to discourage implementations in software, which tend to be slow by comparison. However, modern computers are so fast that satisfactory software implementations are readily available. DES is the most widely used symmetric algorithm in the world [2].

The key length argument goes like this: assuming that the only feasible attack on DES is to try each key in turn until the right one is found, 1,000,000 machines each capable of testing 1,000,000 keys per second would find (on average) one key every 12 hours. Use of multiple-length keys leads us to the Triple-DES algorithm, in which DES is applied three times. If we consider a triple-length key to consist of three 56-bit keys K1, K2, K3, then encryption is as follows:

Encrypt with K1 → Decrypt with K2 → Encrypt with K3

Decryption is the reverse process:

Decrypt with K3 → Encrypt with K2 → Decrypt with K1

Setting K3 equal to K1 in these processes gives us a double-length key K1, K2. Setting K1, K2 and K3 all equal to K has the same effect as using a single-length (56-bit) key. Thus it is possible for a system using triple-DES to be compatible with a system using single-DES.
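The E-D-E structure and its backward compatibility can be shown with a sketch in which a trivial XOR "cipher" stands in for DES. This is purely structural: the stand-in is not secure and is not the real DES algorithm.

```python
# A sketch of the E-D-E composition used by Triple-DES. A toy XOR cipher
# stands in for DES here purely to show the structure (NOT secure).

def enc(key, block):   # toy stand-in for DES encryption
    return block ^ key

def dec(key, block):   # toy stand-in for DES decryption
    return block ^ key

def triple_encrypt(k1, k2, k3, block):
    return enc(k3, dec(k2, enc(k1, block)))

def triple_decrypt(k1, k2, k3, block):
    return dec(k1, enc(k2, dec(k3, block)))

# Setting K1 = K2 = K3 = K collapses E-D-E to a single encryption with K,
# which is why triple-DES hardware can interoperate with single-DES.
```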

2. RSA

RSA is a public key algorithm invented by Rivest, Shamir and Adleman. The key used for encryption is different from (but related to) the key used for decryption. The algorithm is based on modular exponentiation. Numbers e, d and N are chosen with the property that if A is a number less than N, then (A^e mod N)^d mod N = A.

This means that the user can encrypt A with e and decrypt using d. Conversely the user can encrypt using d and decrypt using e (though doing it this way round is usually referred to as signing and verification).

The pair of numbers (e,N) is known as the public key and can be published. The pair of numbers (d,N) is known as the private key and must be kept secret.

The number e is known as the public exponent, the number d is known as the private exponent, and N is known as the modulus. When talking of key lengths in connection with RSA, what is meant is the modulus length. An algorithm that uses different keys for encryption and decryption is said to be asymmetric. Anybody knowing the public key can use it to create encrypted messages, but only the owner of the secret key can decrypt them [10].

Conversely the owner of the secret key can encrypt messages that can be decrypted by anybody with the public key. Anybody successfully decrypting such messages can be sure that only the owner of the secret key could have encrypted them. This fact is the basis of the digital signature technique.

Without going into detail about how e, d and N are related, d can be deduced from e and N if the factors of N can be determined. Therefore the security of RSA depends on the difficulty of factorizing N. Because factorization is believed to be a hard problem, the longer N is, the more secure the cryptosystem. Given the power of modern computers, a length of 768 bits is considered reasonably safe, but for serious commercial use 1024 bits is recommended.

The problem with choosing long keys is that RSA is very slow compared with a symmetric block cipher such as DES, and the longer the key the slower it is. The best solution is to use RSA for digital signatures and for protecting DES keys. Bulk data encryption should be done using DES.
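The modular-exponentiation relationship above can be traced with deliberately tiny primes. These numbers are only for illustrating the arithmetic; real keys use moduli of 1024 bits and up.

```python
# A toy RSA example. The primes are far too small for any real use.

p, q = 61, 53
N = p * q                      # modulus: 3233
phi = (p - 1) * (q - 1)        # 3120
e = 17                         # public exponent, coprime with phi
d = pow(e, -1, phi)            # private exponent: e*d ≡ 1 (mod phi)

A = 65                         # message, must be less than N
cipher = pow(A, e, N)          # encrypt with the public key (e, N)
plain = pow(cipher, d, N)      # decrypt with the private key (d, N)
assert plain == A

# Signing is the same operation with the roles of e and d swapped:
sig = pow(A, d, N)
assert pow(sig, e, N) == A
```

The three-argument `pow` performs fast modular exponentiation, and `pow(e, -1, phi)` (Python 3.8+) computes the modular inverse.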

3. MD5

MD5 is a 128-bit message digest function developed by Ron Rivest. It is a hashing algorithm that takes a message of up to 2^64 bits and reduces it to a digest of 128 bits (16 bytes). The algorithm is a development of the MD4 algorithm, which Rivest announced in 1990. Unfortunately, MD4 was flawed, so Rivest made some revisions, and the resulting algorithm was christened MD5.

Any hashing (or digest) algorithm should be such that, given a digest and the corresponding message from which it was derived, it should be computationally infeasible to construct a different message with the same digest.
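The digest property is easy to observe with Python's standard hashlib. Note that MD5 is now considered broken for collision resistance and should not be used in new security designs; it appears here only because the text describes it.

```python
# A quick illustration of a message digest using the standard library.
import hashlib

digest = hashlib.md5(b"The quick brown fox").hexdigest()
print(digest)          # 128 bits -> 32 hex characters
assert len(digest) == 32

# Any change to the message completely changes the digest:
assert hashlib.md5(b"The quick brown fox!").hexdigest() != digest
```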

4. AES

This is the Advanced Encryption Standard (using the Rijndael block cipher) approved by NIST. The AES cryptography algorithm is employed to encrypt and decrypt the identity data [6].

AES is based on a design principle known as a substitution-permutation network. It is fast in both software and hardware. Unlike its predecessor DES, AES does not use a Feistel network.

AES has a fixed block size of 128 bits and a key size of 128, 192, or 256 bits, whereas Rijndael can be specified with block and key sizes in any multiple of 32 bits, with a minimum of 128 bits. The block size has a maximum of 256 bits, but the key size has no theoretical maximum.

AES operates on a 4×4 matrix of bytes, termed the state (versions of Rijndael with a larger block size have additional columns in the state). Most AES calculations are done in a special finite field.

The AES cipher is specified as a number of repetitions of transformation rounds that convert the input plaintext into the final output of ciphertext. Each round consists of several processing steps, including one that depends on the encryption key. A set of reverse rounds are applied to transform ciphertext back into the original plaintext using the same encryption key [13].
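The "special finite field" mentioned above is GF(2^8), with multiplication reduced modulo the polynomial x^8 + x^4 + x^3 + x + 1 (0x11B). A full AES implementation is out of scope here, but this sketch shows the byte multiplication used inside the MixColumns step.

```python
# Multiplication of two bytes in GF(2^8), as used by AES MixColumns.

def gmul(a, b):
    result = 0
    for _ in range(8):
        if b & 1:
            result ^= a        # add (XOR) the current multiple of a
        carry = a & 0x80
        a = (a << 1) & 0xFF    # multiply a by x
        if carry:
            a ^= 0x1B          # reduce modulo the AES polynomial
        b >>= 1
    return result

# The worked example from the AES specification (FIPS-197): {57} * {83} = {C1}
assert gmul(0x57, 0x83) == 0xC1
```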

Fig.3 Encryption Technique: at enrolment, iris features are extracted from the iris image to form the original (binary) iris code, which together with the user identification data is encrypted to produce the cipher text key.

5. Reed-Solomon Coding

The original concept of Reed-Solomon coding (Reed & Solomon 1960) describes encoding of k message symbols by viewing them as coefficients of a polynomial p(x) of maximum degree k-1 over a finite field of order N, and evaluating the polynomial at n > k distinct input points. Sampling a polynomial of degree k-1 at more than k points creates an overdetermined system, and allows recovery of the polynomial at the receiver given any k out of n sample points using (Lagrange) interpolation. The sequence of distinct points is created by a generator of the finite field's multiplicative group, and includes 0, thus permitting any value of n up to N.

Fig.4 Decryption Technique: the tested image undergoes iris feature extraction; if the iris matches the original iris template, the cipher text key is decrypted to recover the user identification data, otherwise the system exits.

Using a mathematical formulation, let (x1, x2, ..., xn) be the input sequence of n distinct values over the finite field F; then the codebook C created from the tuples of values obtained by evaluating every polynomial (over F) of degree less than k at each xi is

C = {(f(x1), f(x2), ..., f(xn)) | f ∈ F[x], deg(f) < k},

where F[x] is the polynomial ring over F, and k and n are chosen such that 1 ≤ k ≤ n ≤ N.

As described above, an input sequence (x1, x2, ..., xn) of n=N values is created as

(0, α^0, α^1, ..., α^(N-2))

where α is a primitive root of F. When omitting 0 from the sequence, and since α^(N-1) = 1, it follows that for every polynomial p(x) the function p(αx) is also a polynomial of the same degree, and its codeword is a cyclic left-shift of the codeword derived from p(x); thus, a Reed-Solomon code can be viewed as a cyclic code.
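The polynomial view above can be sketched end to end. For simplicity this toy works over the prime field GF(257) rather than the GF(2^8) used by practical Reed-Solomon codes, and it only demonstrates erasure recovery (any k intact points), not error location.

```python
# Reed-Solomon as polynomial evaluation + Lagrange interpolation over GF(P).
P = 257  # a small prime field

def rs_encode(msg, xs):
    """Evaluate the degree-(k-1) polynomial whose coefficients are msg."""
    return [sum(c * pow(x, i, P) for i, c in enumerate(msg)) % P for x in xs]

def rs_recover(points, k):
    """Lagrange-interpolate the k coefficients from any k (x, y) samples."""
    coeffs = [0] * k
    for i, (xi, yi) in enumerate(points):
        basis, denom = [1], 1
        for j, (xj, _) in enumerate(points):
            if j == i:
                continue
            denom = denom * (xi - xj) % P
            # Multiply the basis polynomial by (x - xj).
            new = [0] * (len(basis) + 1)
            for t, b in enumerate(basis):
                new[t] = (new[t] - xj * b) % P
                new[t + 1] = (new[t + 1] + b) % P
            basis = new
        scale = yi * pow(denom, -1, P) % P
        for t, b in enumerate(basis):
            coeffs[t] = (coeffs[t] + b * scale) % P
    return coeffs

msg = [42, 7, 19]                 # k = 3 message symbols
xs = [1, 2, 3, 4, 5, 6]           # n = 6 > k evaluation points
codeword = rs_encode(msg, xs)
# Any 3 of the 6 (x, y) pairs suffice to recover the message:
sample = [(xs[5], codeword[5]), (xs[0], codeword[0]), (xs[3], codeword[3])]
assert rs_recover(sample, 3) == msg
```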

6. XOR-ECC Method:

This method uses the idea that certain logical operations can be used to obfuscate the iris information with a randomly generated key. Using XOR, we can wrap and unwrap the key with the iris code, and because of the random nature of the key, the security of the system is then ensured. However, since dissimilarities exist even between different pictures of the same iris, varying methods of error correction were used to recalculate the original key, which can be used for hashing and identification [10].

KeyEncrypt = ECC(key) XOR iris1

KeyEC = KeyEncrypt XOR iris2 = ECC(key) XOR iris1 XOR iris2

Fig. 5. Scheme for the XOR-ECC method

When authenticating a user, the iris is scanned to obtain the iris code iris2. This is then XORed with the secured iris code. If the iris is from the same person, the resultant bit string would ideally return the original key after error-correction decoding. Since iris1 and iris2 are not identical, there would be errors in this result where the codes differ, which can be resolved with the error correction code [8].

The hashed value of this resultant key is then compared to that of the original key to determine a match. Since the key in the XOR-ECC method will be hashed and used for comparison, the recomputed key after error correction has to be identical to the original key. Thus, the error correction used in this process has a heavy influence on the robustness of the system.

7. Repetition:

In repetition, the key is copied multiple times to fit the size of the iris code. Since errors do not typically occur in bursts, the rounded average of the repeated bits can then be used to recover each original bit.
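The repetition idea, combined with the XOR locking described above, can be sketched as follows. The iris codes here are random stand-ins, the error positions are chosen by hand, and the sizes are toy values.

```python
# Repetition-code XOR locking: copy the key to the iris-code length, XOR it
# with iris1, then recover it from iris2 by per-bit majority vote over the
# copies (which absorbs scattered, non-burst errors).
import random

random.seed(1)
KEY_BITS, COPIES = 16, 5
key = [random.randint(0, 1) for _ in range(KEY_BITS)]
iris1 = [random.randint(0, 1) for _ in range(KEY_BITS * COPIES)]

expanded = key * COPIES                            # repetition encoding
locked = [k ^ b for k, b in zip(expanded, iris1)]  # secured template

# A second scan of the same iris differs in a few scattered bits:
iris2 = iris1[:]
for pos in (3, 19, 20, 47, 61, 75):
    iris2[pos] ^= 1

noisy = [l ^ b for l, b in zip(locked, iris2)]     # expanded key + errors
recovered = [
    1 if sum(noisy[i + KEY_BITS * c] for c in range(COPIES)) > COPIES // 2
    else 0
    for i in range(KEY_BITS)
]
assert recovered == key                            # key survives the noise
```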

8. Hadamard And Reed-Solomon Error Correction:

Hadamard error correction works well for random errors. To do this, we create an orthogonal Hadamard code Hc with 2^(k-1) columns and 2^k rows [12].

The iris code is then split into groups of k bits; each group is converted from binary to decimal and used to indicate a specific row of the Hadamard matrix. The -1s are then treated as 0s to allow for XOR. This new matrix, whose rows are the selected rows of the Hadamard matrix, is then XORed with the iris code.

When decoding, we find the dot product of each string of 2^(k-1) bits and every row of the Hadamard matrix. Since the Hadamard matrix is orthogonal, the row with the maximum dot product should be the same row that was used to represent the code. Using this row, we can convert the decimal value back to binary to recover the original key, k bits at a time.

Since the Hadamard error correction code was not strong enough to completely recover the key on its own, additional Reed-Solomon codes were used, since they are robust against the burst errors that occur with obstruction of the iris by eyelashes, eyelids, etc.
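The maximum-dot-product decoding can be sketched with a simplified variant that uses the full 2^k × 2^k Sylvester Hadamard matrix (the code described in the text uses a 2^k × 2^(k-1) matrix, but the decoding principle is identical).

```python
# Hadamard encode/decode via Sylvester construction and max dot product.

def hadamard(k):
    """Sylvester Hadamard matrix of order 2**k, as rows of +/-1 entries."""
    H = [[1]]
    for _ in range(k):
        H = [row + row for row in H] + [row + [-x for x in row] for row in H]
    return H

K = 3
H = hadamard(K)                 # 8 orthogonal rows of length 8

def encode(value):
    """Map a K-bit value to its matrix row, with -1 written as 0."""
    return [1 if x == 1 else 0 for x in H[value]]

def decode(bits):
    """Pick the row with the largest dot product against the received word."""
    signs = [1 if b else -1 for b in bits]
    scores = [sum(s * x for s, x in zip(signs, row)) for row in H]
    return max(range(len(H)), key=lambda r: scores[r])

word = encode(5)
word[2] ^= 1                    # one corrupted bit
assert decode(word) == 5        # still decodes to the original value
```

Because the rows are orthogonal, the true row's score drops only by 2 per flipped bit while every other row's score stays near zero, which is what makes the argmax robust to a few random errors.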

9. Authentication Service In A U-City Environment

Many biometric information items, such as finger veins, left and right irises, fingerprints, and facial features, are available. Double or triple authentication is implemented by forcing the input of more than two biometric information items, selecting them randomly from among the several available items [5].

Once this method is implemented, confidentiality can be protected to a larger extent, although separate biometric devices are still required to obtain each biometric information item. This is not of great concern, however, because only the biometric recognition devices that are available will be used.

10. The Fuzzy Vault System:

The fuzzy vault is an example of a well-known biometric cryptosystem that can be used to securely bind private keys with biometric templates [7].

A reliable fuzzy vault system can be based on local iris features. Multiple iris features from multiple local regions in a given iris image are extracted, and exact values of the unordered set are then produced using a clustering method. Since fuzzy vault systems require the alignment of biometric templates with input data, a shift-matching technique is applied for this purpose.
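A toy fuzzy vault can illustrate the binding idea: the secret becomes polynomial coefficients, genuine feature points lie on the polynomial, and chaff points hide them. Everything here (the field GF(97), the feature values, the vault size) is an invented miniature, and no error correction is attempted.

```python
# A miniature fuzzy vault over GF(97): lock a secret behind feature points.
import random

P = 97
secret = [13, 5, 21]                  # coefficients of a degree-2 polynomial

def poly(x):
    return sum(c * pow(x, i, P) for i, c in enumerate(secret)) % P

features = [10, 23, 41, 57, 66]       # enrolled iris features (made up)
vault = [(x, poly(x)) for x in features]
random.seed(4)
while len(vault) < 20:                # add chaff points off the polynomial
    x, y = random.randrange(P), random.randrange(P)
    if all(x != vx for vx, _ in vault) and y != poly(x):
        vault.append((x, y))
random.shuffle(vault)

def lagrange(pts):
    """Interpolate polynomial coefficients through pts over GF(P)."""
    k = len(pts)
    coeffs = [0] * k
    for i, (xi, yi) in enumerate(pts):
        basis, denom = [1], 1
        for j, (xj, _) in enumerate(pts):
            if j == i:
                continue
            denom = denom * (xi - xj) % P
            new = [0] * (len(basis) + 1)
            for t, b in enumerate(basis):
                new[t] = (new[t] - xj * b) % P
                new[t + 1] = (new[t + 1] + b) % P
            basis = new
        s = yi * pow(denom, -1, P) % P
        for t, b in enumerate(basis):
            coeffs[t] = (coeffs[t] + b * s) % P
    return coeffs

def unlock(query_features):
    """Recover the secret if the query shares enough genuine features."""
    genuine = [(x, y) for x, y in vault if x in set(query_features)]
    if len(genuine) < len(secret):
        return None
    return lagrange(genuine[:len(secret)])
```

A query overlapping three enrolled features (e.g. `unlock([23, 41, 66])`) recovers the secret, while too small an overlap returns nothing, which is the unordered-set matching property the fuzzy vault is built on.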

5. TEMPLATE MATCHING:

Hamming Distance:

The Hamming distance gives a measure of how many bits differ between two bit patterns [4]. Using the Hamming distance of two bit patterns, a decision can be made as to whether the two patterns were generated from different irises or from the same one. In comparing the bit patterns, the Hamming distance (HD) is defined as the sum of disagreeing bits (the sum of the exclusive-OR between iris 1 and iris 2) over N, the total number of bits in the bit pattern. The resulting equation for HD is:

HD = Sum(Iris1 XOR Iris2) / N

where Iris1 is the feature template stored in the database and Iris2 comes from the input image.
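The formula translates directly into code; the bit strings below are illustrative stand-ins for real iris codes.

```python
# Normalized Hamming distance between two equal-length bit patterns.

def hamming_distance(iris1, iris2):
    """Fraction of disagreeing bits (XOR count over N)."""
    assert len(iris1) == len(iris2)
    disagree = sum(a != b for a, b in zip(iris1, iris2))
    return disagree / len(iris1)

# Identical codes give HD = 0; unrelated random codes average HD near 0.5.
print(hamming_distance("10110011", "10100111"))   # 2 of 8 bits differ
```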

6. PERFORMANCE ANALYSIS

From the analysis, we determined that the XOR-ECC method had the best overall accuracy rate when Hadamard and Reed-Solomon error correction codes are both used. However, while this method yields a greater accuracy rate, we found that our particular implementation of XOR-ECC with Hadamard and Reed-Solomon ran on average 31 times slower than random projection. This may mean that random projection would be better for searching through a large database of irises, where a 10% correct-acceptance margin is sacrificed for a faster search algorithm.

Fig. 6: Comparison of different error correction techniques

7. CONCLUSION

While biometric authentication can offer a high degree of security, it is far from a perfect solution. The risks of compromise of a distributed database of biometrics used in security applications are high, particularly where the privacy of individuals, and hence non-repudiation and irrevocability, are concerned. It is possible to remove the need for such distributed databases through the careful application of biometric infrastructure without compromising security. This paper studied different iris-based cryptographic algorithms and analysed their performance. The feature vector is extracted from the iris, and then different algorithms are used for directly encrypting and decrypting the data. The XOR-ECC method using Hadamard and Reed-Solomon error correction codes has the highest rates of correct accept (60%) and correct reject (100%). However, we also found random projection to run faster, which may make it better for searches through a database. Experimental results show that the Reed-Solomon error-correcting algorithm can work effectively compared to the other techniques.