- Written by Mathew Thomas
UIDAI's Tragi-comedy of Claimed Unintended Consequences - Serendipity?
A 'Times of India' headline yesterday claimed, "Aadhaar helps 16 mentally challenged boys get back home".
The link to the news report is below.
Why are these "tall" and stupid claims being made? Perhaps the claims are made by those who are so mentally challenged that they cannot understand the basics of biometrics and refuse to study or examine anything before pronouncing opinions.
That may be an explanation of how the claims come to be made, but it does not answer the question of their motives.
A statement attributed in the news report to R Nagarathna, superintendent of the government home, is: "If the word 'duplicate' pops up during the registration process, it means the person has already registered under Aadhaar".
She obviously has no clue what a "duplicate" is and is unaware that biometric identification does not locate duplicates but throws up millions of false matches. This is mathematically proven and scientifically known.
Further, the biometrics of children are as yet undeveloped and hence of poor quality for use in biometric identification systems. That is why Section 5 of the Aadhaar Act provides for special measures for over 80% of the population, including children.
So, the news report of people living in a make-believe world of biometric fairy tales is yet another devious attempt to claim virtues of a scheme where none exist.
That the media publishes such tripe is indicative of its ignorance and motivation.
https://timesofindia.indiatimes.com/city/bengaluru/aadhaar-helps-16-mentally-challenged-boys-get-back-home/articleshow/65662720.cms
- Written by Mathew Thomas
Now Hafeez Saeed Can Vote in Indian Elections & His Party Contest!
It's great news. The Election Commission is planning to link - they call it "integrate" - the Sham ID, called "Aadhaar", with voter lists.
Since the Sham ID is for all residents, not for citizens alone, Hafeez Saeed can not only vote, his party can contest and win elections.
There are only two things that are infinite - God and our stupidity.
Here is the link to the news.
- Written by Mathew Thomas
Thursday October 7th 2010
Biometrics
The Difference Engine: Dubious security
Oct 1st 2010, 8:22 by N.V. | LOS ANGELES
THANKS to gangster movies, cop shows and spy thrillers, people have come to think of fingerprints and other biometric means of identifying evildoers as being completely foolproof. In reality, they are not and never have been, and few engineers who design such screening tools have ever claimed them to be so. Yet the myth has persisted among the public at large and officialdom in particular. In the process, it has led—especially since the terrorist attacks of September 11th 2001—to a great deal of public money being squandered and, worse, to the fostering of a sense of security that is largely misplaced.
Authentication of a person is usually based on one of three things: something the person knows, such as a password; something physical the person possesses, like an actual key or token; or something about the person’s appearance or behaviour. Biometric authentication relies on the third approach. Its advantage is that, unlike a password or a token, it can work without active input from the user. That makes it both convenient and efficient: there is nothing to carry, forget or lose.
The downside is that biometric screening can also work without the user’s co-operation or even knowledge. Covert identification may be a boon when screening for terrorists or criminals, but it raises serious concerns for innocent individuals. Biometric identification can even invite violence. A motorist in Germany had a finger chopped off by thieves seeking to steal his exotic car, which used a fingerprint reader instead of a conventional door lock.
Another problem with biometrics is that the traits used for identification are not secret, but exposed for all and sundry to see. People leave fingerprints all over the place. Voices are recorded and faces photographed endlessly. Appearance and body language are captured on security cameras at every turn. Replacing misappropriated biometric traits is nowhere near as easy as issuing a replacement for a forgotten password or lost key. In addition, it is not all that difficult for impostors to subvert fingerprint readers and other biometric devices.
Biometrics have existed since almost the beginning of time. Hand-prints that accompanied cave paintings from over 30,000 years ago are thought to have been signatures. The early Egyptians used body measurements to ensure people were who they said they were. Fingerprints date back to the late 1800s. More recently, computers have been harnessed to automate the whole process of identifying people by biometric means.
Any biometric system has to solve two problems: identification ("who is this person?") and verification ("is this person who he or she claims to be?"). It identifies the subject using a “one-to-many” comparison to see whether the person in question has been enrolled in the database of stored records. It then verifies that the person is who he or she claims to be by using a “one-to-one” comparison of some measured biometric against one known to come from that particular individual.
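To make the distinction concrete, here is a minimal sketch in Python of the two matching modes, assuming a generic similarity score and a purely illustrative threshold; the function names and numbers are not those of any real biometric system.

```python
# Minimal sketch of the two biometric matching modes, assuming a generic
# similarity score in [0, 1] and an illustrative decision threshold.
# Nothing here reflects any real vendor's API; it only shows the logic.

from typing import Optional

THRESHOLD = 0.8  # illustrative operating point, not a real system's setting

def similarity(sample_a: list[float], sample_b: list[float]) -> float:
    """Toy similarity: 1 minus the mean absolute difference of feature values."""
    diffs = [abs(a - b) for a, b in zip(sample_a, sample_b)]
    return 1.0 - sum(diffs) / len(diffs)

def verify(probe: list[float], enrolled: list[float]) -> bool:
    """One-to-one: is this person who he or she claims to be?"""
    return similarity(probe, enrolled) >= THRESHOLD

def identify(probe: list[float], database: dict[str, list[float]]) -> Optional[str]:
    """One-to-many: who, if anyone, in the database is this person?"""
    best_id, best_score = None, 0.0
    for person_id, template in database.items():
        score = similarity(probe, template)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id if best_score >= THRESHOLD else None
```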
Scanning the fibres, furrows and freckles of the iris in the eye is currently the most accurate form of biometric recognition. Unfortunately, it is also one of the most expensive. Palm-prints are cheaper and becoming increasingly popular, especially in America and Japan, where fingerprinting has been stigmatised by its association with crime. Even so, being cheap and simple, fingerprints remain one of the most popular forms of biometric recognition. But they are not necessarily the most reliable. That has left plenty of scope for abuse, as well as miscarriage of justice.
The eye-opener was the arrest of Brandon Mayfield, an American attorney practicing family law in Oregon, for the terrorist bombing of the Madrid subway in 2004 that killed 191 people. In the paranoia of the time, Mr Mayfield had become a suspect because he had married a woman of Egyptian descent and had converted to Islam. A court found the fingerprint retrieved from a bag of explosives left at the scene, which the Federal Bureau of Investigation (FBI) had “100% verified” as belonging to Mr Mayfield, to be only a partial match—and then not for the finger in question.
As it turned out, the fingerprint belonged to an Algerian national, as the Spanish authorities had insisted all along. The FBI subsequently issued an apology and paid Mr Mayfield $2m as a settlement for wrongful arrest. But in its rush to judgment, the FBI did more than anything, before or since, to discredit the use of fingerprints as a reliable means of identification.
What the Mayfield case teaches about biometrics in general is that, no matter how accurate the technology used for screening, it is only as good as the system of administrative procedures in which it is embedded. That is also one of the findings of a five-year study (“Biometric Recognition: Challenges and Opportunities”) published on September 24th by the National Research Council in Washington, DC.
The panel of scientists, engineers and legal experts who carried out the study concludes that biometric recognition is not only “inherently fallible”, but also in dire need of some fundamental research on the biological underpinnings of human distinctiveness. The FBI and the Department of Homeland Security are paying for studies of better screening methods, but no one seems to be doing fundamental research on whether the physical or behavioural characteristics such technologies seek to measure are truly reliable, and how they change with age, disease, stress and other factors. None looks stable across all situations, says the report. The fear is that, without a proper understanding of the biology of the population being screened, installing biometric devices at borders, airports, banks and public buildings is more likely to lead to long queues, lots of false positives, and missed opportunities to catch terrorists or criminals.
What is often overlooked is that biometric systems used to regulate access of one form or another do not provide binary yes/no answers like conventional data systems. Instead, by their very nature, they generate results that are “probabilistic”. That is what makes them inherently fallible. The chance of producing an error can be made small but never eliminated. Therefore, confidence in the results has to be tempered by a proper appreciation of the uncertainties in the system.
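A back-of-the-envelope simulation makes the point: if genuine and impostor similarity scores overlap at all, every choice of threshold trades false accepts against false rejects, and neither error rate reaches zero. The score distributions below are invented purely for illustration.

```python
# Illustration of why biometric decisions are probabilistic: genuine and
# impostor similarity scores overlap, so any threshold trades false accepts
# against false rejects. The Gaussian parameters are invented for illustration.

import random

random.seed(0)
genuine  = [random.gauss(0.80, 0.10) for _ in range(100_000)]  # same person
impostor = [random.gauss(0.50, 0.10) for _ in range(100_000)]  # different people

for threshold in (0.60, 0.70, 0.80):
    far = sum(s >= threshold for s in impostor) / len(impostor)  # false accept rate
    frr = sum(s < threshold for s in genuine) / len(genuine)     # false reject rate
    print(f"threshold={threshold:.2f}  FAR={far:.4f}  FRR={frr:.4f}")
```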
On the technical side, such uncertainties may stem from the way the sensors were calibrated during installation, or how their components degrade with age. Maybe the data get corrupted by inappropriate compression, or by bugs in the software that surface only under sporadic conditions. The sensors may be affected by humidity, temperature and lighting conditions. Effects may be aggravated by the need to achieve interoperability between different proprietary parts of the system. There are endless ways for performance to drift out of true.
On the behavioural side, uncertainties may arise from an incomplete understanding of the distinctiveness and stability of the human traits being measured. The attitude of people using the system may affect the results. So will their experience with, or training for, such scanning equipment.
Whatever, if the likelihood of an impostor or wanted criminal showing up is rare, even recognition systems that have very accurate sensors can produce a lot of false alarms. And when a system generates a fair number of false positives relative to the remote possibility of a true positive, operators will inevitably become lax. That is a fact of life. And when that happens, it defeats the whole objective of having a screening process in the first place.
The body of case law on the use of biometric technology is growing, with some recent cases asking serious questions about the admissibility of biometric evidence in court. Apart from privacy and reliability, biometric recognition raises important issues about remediation. Increasingly, we can expect the courts to use remediation as a way of addressing both lax and fraudulent use of biometrics, especially for individuals (like Mr Mayfield) who have been denied their due rights because of an incorrect match or non-match in some screening process.
The biometrics industry has a vital role to play in these threatening times. But it would win broader acceptance if it paid greater attention to the concerns and cultural values of the people being scanned. And everyone would be better served if a good deal more was known about what it is, biologically, that makes each and everyone of us a unique human being.
Safran may care to consider the experience of GMAC, the organisation that represents 1,800 business schools worldwide. Faced with impostors taking their entrance exam, GMAC spent two years using flat print fingerprinting technology to verify people's identity. It didn't work, they've dropped it and now they're trying palm vein biometrics.
Meg Hillier MP, here in the UK, delights in upbraiding the government for cancelling ID cards and cancelling plans to put fingerprints on British passports. Is she right? Or are GMAC right?
Meanwhile the UK Border Agency has deployed smart gates to 10 airports. These gates compare your face with the picture in your passport. When tested in Manchester, there were so many false negatives -- the machines said that you are not you -- that they had to drop the matching tolerance to 30%. At that point, according to one biometrics expert, the machines couldn't distinguish between Osama bin Laden and Winona Ryder.
Once we've convinced UKBA to stop wasting our money, there is a second matter to consider. Note that the tolerance level can be varied by the user, whether Manchester airport or the FBI or ... The identity they ascribe to you is discretionary. They can assert that you are you or that you are not you, whichever, depending only on how they set the machines. That is not how we usually think of personal identity.
At the root of it all lies a mathematical conundrum (Bayes' theorem, which bedevils many other fields as well): You can't tell how effective any biometric device is unless you know the rate at which people are trying to subvert it.
For air travel, this is definitely not a known quantity to the sort of precision needed for the US TSA (or any other country for that matter) to properly evaluate their equipment. Absent this rather key number they can claim whatever "effectiveness" they wish to justify buying these machines, and no one can prove otherwise.
However, when combined with TSA's terrorist "Watch List" numbers, Bayes' theorem does lead to a rather interesting result:
a) If the nearly million strong "watch list" is full of real terrorists who regularly travel, then these machines are nigh near useless.
b) If the machines are working well, then there can't be that many terrorists running about, and the watch list should be pared down by a factor of 20-50 or so to be effective.
Fundamental math then proves TSA can't have it both ways, but I doubt that will stop them. They'll continue to buy the equipment and expand the watch list. Such is the nature of bureaucracy.
My money's on the machines. If there's really a million active terrorists hitting the airways regularly, then we're doomed anyway.
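For readers who want the arithmetic behind the commenter's Bayes' theorem point, here is a rough worked example; every number in it is an assumption chosen for illustration, not a TSA figure.

```python
# Rough Bayes' theorem illustration: with a tiny base rate of true targets
# among travellers, even a very accurate screen produces mostly false alarms.
# All numbers below are assumptions for illustration only.

def posterior(base_rate: float, hit_rate: float, false_alarm_rate: float) -> float:
    """P(target | alarm) via Bayes' theorem."""
    p_alarm = hit_rate * base_rate + false_alarm_rate * (1.0 - base_rate)
    return hit_rate * base_rate / p_alarm

# Say 1 traveller in 1,000,000 is a genuine target, the screen catches 99%
# of targets and falsely flags 0.1% of everyone else.
print(posterior(1e-6, 0.99, 1e-3))  # ~0.001: only about 1 alarm in 1,000 is real
```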
The primary form of biometrics, historically and today, is face recognition, generally performed by a human. Every driver's license and passport carries a picture of a person's face. We also do voice recognition (does he sound like an Italian?) routinely. So we crossed this threshold long ago. The question is making it better.
The trick with all these things is, as the author points out, understanding the statistics. No single biometric scheme is foolproof, but if you use several independent means, the likelihood of getting it wrong plummets.
That said, the caution against a rush to judgment is very well taken. Even if the chance of two people having the same iris-scan were one in 100 million, you could expect a matching pair roughly every couple of towns of 10,000 (there are 10,000 x 9,999 / 2 ≈ 50 million possible pairs of people).
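A short calculation, using the commenter's assumed one-in-100-million match probability, shows how the expected number of coincidental matches grows with population size.

```python
# Expected coincidental matches under the commenter's assumed (illustrative)
# 1-in-100-million chance that two different irises match. The number of
# pairs, and hence of expected matches, grows roughly with the square of
# the population.

from math import comb

match_probability = 1e-8  # assumed 1 in 100 million, as in the comment

for population in (10_000, 100_000, 1_000_000):
    pairs = comb(population, 2)               # unordered pairs of people
    expected_matches = pairs * match_probability
    print(f"{population:>9,} people -> {pairs:,} pairs -> "
          f"{expected_matches:.1f} expected coincidental matches")
```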
Yes, fingerprints are easily forged ... or lost, if you have a particular medical condition like I do: http://wp.me/ppqxP-9j
The biometrics industry has been dishonest from the start. Biometrics was a term introduced, I think, in 1947 by Ronald Fisher and Sewall Wright to describe the use of mathematics and statistics in biology. But, since no-one thought it necessary to register the name, it was easy for Gates et al to steal it. One should never trust intellectual thieves.
An unfortunate side effect of the probabilistic nature of results from a biometric system is the inability to securely hash the biometric data prior to storage.
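A tiny illustration of why this is so: a cryptographic hash changes completely when even one feature value differs, so a stored hash would never match a fresh scan of the same person. The byte strings below merely stand in for two near-identical biometric samples.

```python
# Cryptographic hashes require exact equality, but two captures of the same
# finger or iris are never bit-identical, so a stored hash would never match
# a fresh scan. The byte strings stand in for two near-identical samples.

import hashlib

scan_1 = bytes([10, 20, 30, 40, 50])
scan_2 = bytes([10, 20, 31, 40, 50])  # one feature value off by one

print(hashlib.sha256(scan_1).hexdigest())
print(hashlib.sha256(scan_2).hexdigest())
# The two digests share nothing in common, so hashing destroys the
# "close enough" comparison that biometric matching depends on.
```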
Biometric and chip/data-technology identification, and RFID technology in particular, are a lot less safe than "old" ways of identification.
True, it is possible to copy a passport, but difficult. Such skills are less and less active now. What is easier is to fake an RFID identification, as you can much more easily fake data than actual paper ID. Especially considering that all data is stored centrally and privacy policies in general are so bad on-line that forging pictures and such is very easy.
There are many cases of catching wireless data, many cases of central data being lost/stolen, and many cases of forged data and security breaches. Considering all this it will be exceptionally easy for people in the future to copy fingerprints, photos and other information to make a fake electronic ID, rather than having to actually show a passport and yourself to a person who will verify the reality of it.
Nothing beats the humans, certainly not machines. Yet. I will not be trusting biometrics and data security.
"The FBI subsequently issued an apology and paid Mr Mayfield $2m as a settlement for wrongful arrest."
If the FBI were to pay USD 2m to everyone who is wrongfully identified, I actually think the system would be quite popular.
Mr Mayfield and many others will never forget what they felt when they were humiliated and arrested by people who believed probabilities of 30% - 50%. I encourage you to buy the most expensive fingerprint reader you can find and have a go at it with a few dozen people...
I'll tell you what, since we work in a fascinating and very revealing industry, we'll use your login credentials, your IP address and a simple phone registry lookup along with a few gov databases to bring the numbers within a reasonable margin of error and then red flag you and a few of your family members on our systems during one of the next 10 international flights you decide to take.
Perhaps you'll be visiting Ireland or the UK again... Yes, we already performed a cursory check and can confirm that you even lived there. We do this regularly to test our technology and improve the probabilities because... 10%, 20% or 30% is still a lot better than zero percent.
Likely scenario:
During one of your travels, you will be taken aside and asked some very personal and revealing questions, such as why you owned a Porsche Boxster but drive a VW Jetta instead, what you carried in such a car and where you drove to on certain dates, and then you will be asked these questions again, and then again in a different, non-disclosed location. Let's see how you feel about the implementation of math probabilities and unreliable technology after such an experience... are you really ready for that?
- Written by Mathew Thomas
Sham ID (SID) / "Aadhaar" Exposed Every day
The Sham ID / "Aadhaar", masquerading as a unique ID (UID), is being exposed every day; and yet there is mindless, partisan support for it.
This is the bane of this country. There is a culture of "my party or group or community or religion, right or wrong".
The nation does not matter. The truth does not matter. Corruption does not matter. Harm, even to oneself, does not matter, because it is neither perceived nor anticipated.
Two pieces of writing, both excellent ones, one titled "Why Aadhaar cannot deliver anything it promises" and the other "Over to Supreme Court", appeared in the papers today.
While these articles cover many important aspects of the Sham ID (SID) called "Aadhaar", they have not touched upon some things more fundamental to it.
There are two false assumptions on which, Sham ID / "Aadhaar" is based.
1. Sham ID / "Aadhaar" provides unique IDs. It does not because biometric identification is inherently fallible.
2. Identification and database control are needed to target subsidies / benefits and government services.
The first assumption is proven false both by research and by mathematics using UIDAI's published specifications. Ground data of large-scale duplicates allegedly detected (in reality, false positive matches) confirms the futility of using biometrics for identification.
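A hedged, back-of-the-envelope sketch of this argument: even a small false positive identification rate (FPIR) per de-duplication check, multiplied across a population-scale enrolment, wrongly flags a very large number of genuine people as "duplicates". The FPIR values below are illustrative assumptions, not UIDAI's published figures.

```python
# Back-of-the-envelope scaling of false positives in de-duplication.
# The FPIR values are illustrative assumptions, not UIDAI's published figures.

enrolments = 1_200_000_000  # roughly the population scale Aadhaar aims at

for fpir in (0.0001, 0.001, 0.01):  # assumed 0.01%, 0.1%, 1% per enrolment check
    falsely_flagged = enrolments * fpir
    print(f"FPIR {fpir:.2%}: about {falsely_flagged:,.0f} people "
          f"wrongly flagged as duplicates")
```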
Secondly, in every subsidy, benefit or other entitlement, there are eligibility criteria to be determined. Hence, IDs alone are of no use. The false notion is being fostered that SID / "Aadhaar" will work the magic of eliminating fraud in government-citizen interaction.
Thirdly, setting up large databases and keeping them updated is an impossible task.
Ignorance in these three areas is the crux of the difference between developed nations, which do not indulge in this type of tomfoolery, and ours.
Here are the links to the articles referred to above.
- Written by Mathew Thomas
The Fifth “Gang”
In his article titled “When you give your biometrics to Modi”, Manu Joseph quotes Nilekani as telling him that opponents of UID / "Aadhaar" can be divided into four gangs – ‘the privacy gang’, ‘the-rights-of-the-poor’ gang, ‘the-oh-my-god-1984-has-arrived’ gang and ‘the luddites’ gang.
Perhaps Nilekani did not learn English properly, and Manu Joseph either did not choose to correct him or acquiesced in what he said; for the article does not record any response from Joseph to such an accusation, which is in poor taste.
“Gang” means a group of organised criminals!
The great IT honcho who “rethinks” India says anyone who differs from his thinking must belong to an organised gang of criminals.
Mercifully, he did not call the luddites a gang; or was that a typo?
I do not belong to any of these gangs, although Nilekani may consider me, too, a criminal since I disagree with him much more than anyone in the “gangs” he referred to.
I have incontrovertible, documentary evidence to show that:
- Biometrics is fallible, error prone and hence, does not provide a unique ID
- By UIDAI's own admission (made not to the public or government, but to its private contractors), the demographic data in the UID database is unreliable
- UID / "Aadhaar" is a mere number, but the lie is constantly repeated calling it “Aadhaar card”
- UID / "Aadhaar" is for ALL RESIDENTS and NOT citizens alone; hence it cannot and should NOT be used for targeting services
- Contrary to repeated assurances that your data is safe (including the one made yesterday in the Rajya Sabha by Shri. Ravi Shankar Prasad, Hon'ble Minister for IT), the entire data, biometric and demographic, is handed over to foreign private companies under UIDAI’s contracts with them
Could handing over the entire biometric and demographic data of the people of the country linked to all databases be called a crime?
If so, what crime is this?
And to which “gang” do such people belong?
Joseph and ALL of the media are ignorant of what I have said (No - they are not ignorant, since I have told many of them, even the passionate TV patriots among them) or have chosen to remain studiously silent about this crime and the "gang" behind it.
The link to Joseph's article is here.
http://www.livemint.com/Leisure/28JmBgGLgWLclLxQtzd9KI/When-you-give-your-biometrics-to-Modi.html
When you give your biometrics to Modi - Livemint: "It is amusing that Aadhaar has remained optional for the poor, and has become mandatory for people whose survival does not depend on the government."