Furthermore, biometric identifiers can be copied or altered once they are stored on a computer, so rather than protecting society from identity fraud and theft, the technology simply takes such crime to a new level. Biometric technology is not 100 per cent accurate. Some 63 million people travel through Heathrow airport each year; even if biometric technology were 99.9 per cent accurate, there would still be 63,000 errors per year (Parliamentary Office of Science and Technology, 2001: 3). Moreover, positive identification relies on the accuracy of the computer databases used. At this level of accuracy, security staff and passengers would lose faith in the system and not co-operate with its implementation.
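The scale of the problem can be illustrated with a simple calculation (the traveller figure is the one cited from the POST note above; the accuracy levels are illustrative):

```python
# Illustrative only: expected misidentifications per year at Heathrow
# for several hypothetical biometric accuracy levels.
TRAVELLERS_PER_YEAR = 63_000_000  # Heathrow figure cited from POST (2001)

for accuracy in (0.99, 0.999, 0.9999):
    errors = TRAVELLERS_PER_YEAR * (1 - accuracy)
    print(f"{accuracy:.2%} accurate -> {errors:,.0f} expected errors per year")
```

Even at 99.99 per cent accuracy, thousands of passengers a year would still be wrongly flagged or wrongly cleared.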
A national ID card loaded with biometric information has been proposed as a remedy for identity fraud. However, whether a move to a national ID card would actually reduce fraud or simply erode privacy is debatable. The rise in identity fraud has been directly linked to the erosion of people’s privacy by technologies that allow companies and others to access varying degrees of personal information (Hatch, 2001: 1469). Indeed, attempts to introduce a national card have been strongly resisted in the past owing to public concern over privacy. Passports containing biometric data, by contrast, have not caused such civil alarm, although civil libertarians remain troubled that passports utilize face recognition technology, which means they can be used to track individuals from a distance without their knowledge (Wilson, 2007: 213).
Biometric identification is problematic: face recognition technology is only as accurate as the algorithm used. For counterterrorism, algorithms need to work on people who have changed their appearance, for example by shaving a beard or wearing false eyeglasses. Furthermore, the recognition software must automatically correct for differences in lighting, expression and the angle of the image. There have been problems with the implementation of biometric identifiers in the US-VISIT system, particularly with facial recognition technology, where a number of ‘false hits’ on black and Asian faces led the Department of Homeland Security (DHS) to admit that it had ‘bought a lot of stuff off the shelf that wasn’t effective’ (‘U.S. to Spend Billions More to Alter Security Systems’, 2005).
Moreover, the politics surrounding the use of risk profiling as a means of governing mobility within the war on terror is problematic. The first step in developing any security measure is to compile a detailed threat profile establishing the characteristics of the danger (Schneier as cited in BILETA, 2005: 5). Thus, the guiding assumption is that encoded risk profiles can be used as a basis to predict and prevent future acts of terrorism. However, there is enormous potential for error and for violation of international human rights standards within this system. Amoore argues that such risk profiling has serious implications for marginalised minority groups, who have found that such citizen profiling amounts to racial and ethnic targeting (Amoore, 2006: 346). Cole and Lynch rightly argue that biometric technologies ‘reproduce many of the racial and other forms of discrimination’ that characterise criminal justice practices, and that the ‘suspect of the future may end up looking very much like the suspect of the past’ (Cole & Lynch, 2006: 56).
One of the major concerns about implementing biometric identification technologies is the erosion of privacy. The five biggest concerns of privacy advocates are: loss of freedom; loss of anonymity; commercial exploitation and storage of intimate information without a person’s consent or knowledge; coercion on the part of those in control of sensitive biometric data; and changes in the purpose for which data were originally collected (Peterson, 2007: 741). These concerns have serious implications for society. With no specific privacy laws relating to biometric identification and surveillance technologies, it can be argued that contemporary society could come to resemble the situation in Orwell’s 1984. Furthermore, those in control of biometric data would be exposed to greater risk of harm from the criminal element of society because of the knowledge they possess.
Locke argued that as a citizen of a free and liberal state he should have ‘a liberty to follow my own will in all things where the rule prescribes not; and not to be subject to the…will of another man’ (Locke, 1988: 418). On a Lockean view, biometric technology constricts the basic freedoms guaranteed by the constitution and denies the individual’s right to a private realm of thought and action. Such a lack of privacy has serious implications for the functioning of society, as constant surveillance would impact on the basic rights of freedom of speech and religious expression: people would deliberately censor their actions if they knew they were being monitored, particularly if their private thoughts and actions could be recorded and used against them, such as by identifying them as a security risk. McCullagh asks if this is a new era in which individuals have a right to privacy but ‘not necessarily to anonymity’ (BILETA, 2005: 2).
The use of biometric identification and surveillance technologies is comparable to Foucault’s ideas on the panopticon. The panopticon serves as a warning of the kind of biometric database surveillance society we must strive to avoid. Universal identifiers have been criticised as leading towards behavioural profiles of individuals based on controversial data-matching techniques (Shattuck as cited in BILETA, 2005: 6). This emphasises a move towards pre-emptive surveillance, which has great implications for society. Such surveillance serves only to reinforce racial and religious prejudices, particularly under a government which wants to anticipate, and stop, terrorist actions.
It is clear that biometric technology can be applied successfully in many areas of society. However, Clarke rightly argues that a wider social policy impact assessment may be appropriate in order to determine how best to maintain civil liberty protections in the face of power challenges by the state, as the asymmetries produced and maintained in the operation of biometrics involve power issues that go beyond individual privacy concerns (Clarke, 2001). Issues around privacy cannot be ignored, as biometric technology is among the most serious of the many technologies of surveillance that are threatening the freedom of individuals and of societies. In the Australian context, Advance Passenger Processing, the Movement Alert List, the Regional Movement Alert List, the biometric identification of asylum seekers and the introduction of the ePassport are constitutive and symbolic components of the securitization of mobility. These searchable databases and biometric identification systems engaged in the mission of social sorting at the border are not impartial. As Graham and Wood note of digital surveillance systems, while they may be characterized by flexibility and ambivalence, and contingent upon judgments of social and economic worth built into their design, they are ‘likely to be strongly biased by the political, economic and social conditions that shape the principles embedded in their design and implementation’ (2003: 229). Surveillance technologies and practices positioned within a frame of security and control diminish the spaces that human rights and social justice might occupy.
Bibliography
Alterman, A. (2003). ‘A piece of yourself: Ethical issues in biometric identification’. Ethics and Information Technology, 5: 139-150.
Amoore, L. (2006). ‘Biometric borders: Governing mobilities in the war on terror’. Political Geography, 25: 336-51.
Clarke, R. (2001) ‘Biometrics and privacy’. Retrieved April 3, 2008, from <>
Cole, S.A. & M. Lynch. (2006). ‘The social and legal construction of suspects’. Annual Review of Law and Social Science. 2: 39-60.
Garfinkel, S. (2001). Database Nation: The Death of Privacy in the 21st Century. Cambridge: O’Reilly & Associates.
Graham, S. & D. Wood. (2003). ‘Digitizing surveillance: categorization, space, inequality’. Critical Social Policy, 23(2): 227-48.
Hatch, M. (2001). ‘The Privatisation of Big Brother: Protecting Sensitive Personal Information from Commercial Interests in the 21st Century’. William Mitchell Law Review. 27(3): 1457-1502.
Locke, J. (1988). Two Treatises of Government. Cambridge: Cambridge University Press.
McCullagh, K. (2005, April). ‘Identity information: the tension between privacy and the societal benefits associated with biometric database surveillance’. Presented at the 2005 British and Irish Law, Education and Technology Association (BILETA) Conference. Retrieved March 30, 2008, from <>
Parliamentary Office of Science and Technology. (2001). Postnote: Biometrics and Security. Retrieved April 8, 2008, from <>
Peterson, J.K. (2007). Understanding surveillance technologies: spy devices, privacy, history and applications (2nd ed.). Boca Raton: Auerbach Publications.
‘U.S. to Spend Billions More to Alter Security Systems’. (2005, May 8). New York Times.
Wilson, D. (2007). ‘Australian Biometrics and Global Surveillance’. International Criminal Justice Review. 17(3): 207-219.
The design for the panopticon involved a central tower surrounded by cells. The idea was to allow one guard to monitor the activities of all the prisoners, saving a great deal of time and labour. Furthermore, the tower would contain dark glass, so that prisoners never knew whether they were being observed, which regulated their behaviour.