By Business Wire
May 1, 2014 07:06 PM EDT
The cutting edge of biometric identification — using fingerprints or eye scans to confirm a person’s identity — isn’t at the FBI or the Department of Homeland Security. It’s in India.
India’s Aadhaar program, operated by the Unique Identification Authority of India (UIDAI) and created to confirm the identities of citizens who collect government benefits, has amassed fingerprint and iris data on 500 million people. It is the biggest biometric database in the world, twice as big as that of the FBI. It can verify one million identities per hour, each one taking about 30 seconds.
The program unnerves some privacy advocates with its Orwellian overtones, and the U.S.-based Electronic Frontier Foundation has criticized it as a threat to privacy.
But many developing countries see biometric identification (ID) as a potential solution for millions of citizens who don’t have any official and fraud-resistant ID. The Indian government distributes $40 billion a year in food rations, but fraud is rampant because most people lack proper ID. Analysts estimate that 40 percent of India’s food rations never reach the people they are intended to help. Indeed, a new study of initial results from India’s biometric program found that it both reduced corruption and was popular with beneficiaries.
India isn’t the only developing nation to explore biometric strategies. The Center for Global Development, a Washington-based think tank, reports that 70 nations have some sort of biometric program.
Now a Stanford business professor is proposing a way to make India’s program far more accurate. Lawrence Wein, a professor of management science, applies mathematical and statistical modeling to solve complex practical puzzles.
In health care, Wein has analyzed strategies to optimize food aid in Africa and to mitigate the toll of pandemic influenza. In homeland security, he has developed strategies that the U.S. government has adopted for responding to bioterrorist attacks involving smallpox, anthrax, and botulism.
Wein’s interest in biometrics started almost a decade ago, with his analysis of fingerprint strategies used by the Department of Homeland Security’s US-VISIT program for nonresidents entering the country. That analysis influenced the government’s decision to switch from a two-finger to a 10-finger identification system.
For Indian officials, the big practical challenge has been to make the program more accurate without getting bogged down when used by a billion people.
The system has to be accurate enough to spot all but about one in 10,000 imposters. But it shouldn’t be so foolproof that it falsely rejects large numbers of people who are who they claim to be. Nor should it take so long that people have to wait in long lines. If either of those things happened, few people would sign up. Participation in India’s program is voluntary, not mandatory.
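The tension between catching impostors and not turning away legitimate users is the classic threshold tradeoff in biometric matching. As a rough illustration (the score distributions below are invented, not Aadhaar's actual data), raising the match-score threshold lowers the false-accept rate but raises the false-reject rate:

```python
import random

random.seed(0)

# Hypothetical match scores: genuine users tend to score high,
# impostors low. Purely illustrative distributions.
genuine = [random.gauss(0.80, 0.10) for _ in range(100_000)]
impostor = [random.gauss(0.30, 0.10) for _ in range(100_000)]

def error_rates(threshold):
    """Return (false_accept_rate, false_reject_rate) at a threshold."""
    far = sum(s >= threshold for s in impostor) / len(impostor)
    frr = sum(s < threshold for s in genuine) / len(genuine)
    return far, frr

# A stricter threshold admits fewer impostors but rejects more
# genuine users -- the tradeoff described above.
for t in (0.5, 0.6, 0.7):
    far, frr = error_rates(t)
    print(f"threshold={t:.1f}  FAR={far:.5f}  FRR={frr:.5f}")
```

No single threshold eliminates both errors at once; a system tuned to stop one impostor in 10,000 must be careful not to push the false-reject side so high that people abandon the program.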
India’s system is sophisticated. When a person first enrolls, scanners take image data for all 10 fingers and both irises. When people show up at a local office to receive a benefit, they get scanned again. That data is then sent to the central database, which compares it to the person’s original enrollment data.
But comparisons are complicated. One problem is that the scanning equipment where people first enroll is usually more expensive and sophisticated than the equipment at local government offices. That sets the stage for a lot of false rejections. Making matters more difficult, fingerprints and even irises vary tremendously in how distinctive they are.
The tradeoff is between accuracy and speed. Comparing all 10 fingerprints, or both irises, is extremely accurate, but it takes about 107 seconds. That may sound lightning fast, but it isn’t for a system that is supposed to perform 1 million verifications an hour. To speed up the process, Indian officials originally compared only a person’s right thumbprint. But a single thumbprint — or any other individual fingerprint — may be too hazy to compare. Indian officials then latched on to the idea of picking a person’s best fingerprint — the one that provides the easiest match. Results were better, but not ideal.
Wein teamed up with two graduate researchers, Apaar Sadhwani and Yan Yang, to derive and test sophisticated algorithms based on the Indian biometric data. Wein didn’t charge for his work, but he thought that it might have ramifications for many other governments, as well as for commercial companies. Indeed, banks in India are already developing their own applications for the Aadhaar system.
The researchers’ solution, which Indian officials are studying at the highest levels, is to focus on a particular subset of each person’s fingerprints and eye scans that are the easiest to compare to those originally scanned. The combination of fingerprints and iris data will vary from person to person. For some people, it could be just the right index finger. For others, it could be an index finger and a thumb. Or, it could be the irises, or a combination of fingerprints and irises.
For many people, as it turned out, an easy check of only one or two fingerprints is enough for an accurate identity confirmation. For about 37 percent of people, it’s necessary to compare just the irises. And for a very small number of people, it’s necessary to compare both irises and some fingerprints.
By spending a small amount of time on most people, and more time on a minority of others, the researchers found they could keep the average verification time to just 37 seconds. That’s a bit longer than it takes to just compare one finger, but the rate of false rejections is about 200,000 times lower.
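The personalized policy can be sketched in a few lines: for each enrollee, choose the fastest combination of biometrics whose estimated reliability for that person clears a target. This is only a schematic of the idea described above; the check names, timings, and reliability numbers are invented for illustration, not taken from the paper.

```python
# Candidate checks, ordered fastest first: (name, seconds to verify).
# Timings are hypothetical except the 107-second full comparison
# mentioned in the article.
CHECKS = [
    ("right_index", 5),
    ("right_index+thumb", 9),
    ("both_irises", 30),
    ("irises+fingers", 107),
]

def pick_policy(reliability, target=0.9999):
    """Return the fastest check whose per-person reliability estimate
    (probability of matching a genuine user) meets the target."""
    for name, seconds in CHECKS:
        if reliability.get(name, 0.0) >= target:
            return name, seconds
    return CHECKS[-1]  # fall back to the slow, full comparison

# Two hypothetical enrollees: one with distinctive fingerprints,
# one whose prints are hazy and needs an iris comparison.
clear_prints = {"right_index": 0.99995, "both_irises": 0.99999}
hazy_prints = {"right_index": 0.97, "both_irises": 0.99995}

print(pick_policy(clear_prints))  # ('right_index', 5)
print(pick_policy(hazy_prints))   # ('both_irises', 30)
```

Because most people resolve with a cheap fingerprint check and only a minority fall through to the slower iris comparison, the average verification time stays low even though the worst case is long.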
Wein doesn’t expect the United States to replicate the Indian approach. Americans are already suspicious about government surveillance, and most Americans already carry driver’s licenses and other forms of photo identification that are fairly hard to forge. But for low-income countries, he says, biometrics may have a big future.
The paper, “Analyzing Personalized Policies for Online Biometric Verification,” was published by PLOS ONE on May 1, 2014.