IDSA COMMENT

Facial Recognition Technology and Counter-Terror Operations

Lt Col Akshat Upadhyay is Research Fellow, Strategic Technologies Centre at Manohar Parrikar Institute for Defence Studies and Analyses, New Delhi.
August 05, 2022

    The ‘Artificial Intelligence in Defence (AIDef)’ symposium and exhibition, the first of its kind, was held on 11 July 2022 and showcased 75 products based on Artificial Intelligence (AI), in keeping with the theme of ‘Azadi ka Amrit Mahotsav’ celebrating 75 years of India’s independence.1 The Armed Forces, as well as research organisations, industry, defence start-ups and innovators took part in the exhibition, which was the culmination of a four-year process of first introducing and then leveraging AI and AI-based products in defence. The aim is to speed up decision-making, enhance cybersecurity, strengthen perimeter security, enable predictive maintenance and use natural language processing (NLP) algorithms for on-the-spot translation for troops, especially when facing adversaries along disputed borders.2

    Ministry of Defence Initiatives on AI

    An AI Task Force set up under the Ministry of Defence (MoD) in February 2018 came out with its recommendations in less than six months, in June 2018.3 The report identified five areas for developing AI-based solutions for the Indian Armed Forces: lethal autonomous weapon systems (LAWS), unmanned surveillance, simulated wargames and training, cyber and aerospace security, and intelligence and reconnaissance.4

    These recommendations were in addition to the proposals of the Task Force on AI for India’s Economic Transformation of 19 January 2018, headed by Professor V. Kamakoti, which identified 10 domains where AI could be used.5 In the sphere of national security, the four areas highlighted were autonomous surveillance and combat systems, adaptive communication systems, cyber-attack mitigation and counter-attack systems, and multi-sensor data fusion-based systems.

    Based on these inputs, the Defence AI Council (DAIC), headed by the Raksha Mantri, and the Defence AI Project Agency (DAIPA), headed by the Secretary (Defence Production), were formed in February 2019.6 While the DAIC has been established as a policy-making body, DAIPA is responsible for implementing DAIC’s policy decisions and has been tasked with developing pragmatic solutions in collaboration with the Defence Research and Development Organisation (DRDO), academia and industry.7 An AI-based defence roadmap was formulated for the Defence Public Sector Undertakings (DPSUs) in August 2019, based on which 40 AI products were developed by March 2022.8

    In the AIDef symposium, out of the 75 products, 15 were based on Command, Control, Communications, Computers, Intelligence, Surveillance and Reconnaissance (C4ISR), 10 on autonomous and unmanned robotic systems, 10 on intelligent monitoring systems, seven on manufacturing and maintenance, six each on process flow automation and NLP, four each on AI platform automation and perimeter security systems, three each on internet of things (IoT) and operational data analytics, two on LAWS, and one each on simulator/test equipment, logistics and supply chain management, blockchain-based automation, cyber security and human behavioural analysis.9 Among these were two projects based on Facial Recognition Technology (FRT): the iSentinel and the Silent Sentry system.10

    The iSentinel’s listed capabilities include ‘historical tracking of people’, detection of ‘emotions, facial expressions and body language for patterns of argument, restlessness and sweating’, and ‘behaviour analysis’ for threat identification.11 The Silent Sentry offers both ‘human detection’ and ‘facial recognition’.12 Both products arguably rely on FRT, an AI-based technology that quantifies distinctive features of a human face (80, as per one study), such as the distance between the eyes or the distance from the forehead to the chin.13 This data is then compared against a database to establish the identity of the individual captured by the camera.
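    As a rough illustration of this matching step, the sketch below is a simplified assumption and not a description of iSentinel, Silent Sentry or any deployed system: a probe face is reduced to a feature vector (‘embedding’) and compared against stored reference embeddings using cosine similarity. The function extract_embedding is a hypothetical placeholder for whatever landmark-based or deep-learning encoder a real system would use.

```python
# Minimal sketch of FRT identity matching, assuming faces are reduced to
# fixed-length feature vectors and compared by cosine similarity.
import numpy as np

def extract_embedding(face_image: np.ndarray) -> np.ndarray:
    """Hypothetical placeholder: a real system would return a feature vector
    derived from facial landmarks or a trained neural network."""
    raise NotImplementedError

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(face_image: np.ndarray, database: dict, threshold: float = 0.6):
    """Return the best-matching identity from the database, or None if no
    similarity score clears the threshold."""
    probe = extract_embedding(face_image)
    best_id, best_score = None, -1.0
    for identity, reference in database.items():
        score = cosine_similarity(probe, reference)
        if score > best_score:
            best_id, best_score = identity, score
    return best_id if best_score >= threshold else None
```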

    Challenges of FRT in Counter-insurgency/Counter-terrorism Operations

    Though a case can be made for the use of FRT by the Indian Army in counter-insurgency (CI)/counter-terrorism (CT) operations in certain areas such as Kashmir or parts of the North East, there are some challenges that need to be surmounted. The basic principle behind the use of FRT in a CI/CT environment is to identify threats to either an Army camp or a company operating base (COB). The AI solutions currently being marketed are for static installations and provide early warning so that defence measures can be activated.

    The prerequisites of an effective FRT system are an exhaustive digital library or inventory of resident or terrorist facial data, high-quality cameras for capturing images of individuals approaching the camp, secure and fast communications, and processors powerful enough for the mapping and matching algorithms to produce results in real time. Maintaining these in a CI/CT environment, along with adequate power backup, is a challenge, though one that can still be managed.
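    To make the real-time requirement concrete, the back-of-the-envelope sketch below estimates how many camera frames per second a processor could handle for a given database size. All timing figures and the 50,000-entry database size are assumed numbers for illustration, not measurements from any fielded system.

```python
# Rough throughput estimate for an FRT pipeline at a static installation.
# All timing figures below are assumptions, for illustration only.

def frames_per_second(detect_ms: float, embed_ms: float,
                      match_ms_per_1000: float, db_size: int) -> float:
    """Frames the pipeline can process per second, given per-frame costs for
    face detection, embedding extraction and database matching."""
    per_frame_ms = detect_ms + embed_ms + match_ms_per_1000 * (db_size / 1000)
    return 1000.0 / per_frame_ms

# Hypothetical edge device at a camp perimeter, matching against a
# 50,000-entry resident database: roughly 3 frames per second.
print(frames_per_second(detect_ms=30, embed_ms=40,
                        match_ms_per_1000=5, db_size=50_000))
```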

    The challenges of using FRT software in a CI/CT scenario, though, go beyond the prerequisites of a solid dataset and hardware/software. Firstly, a decision needs to be made regarding the kind of identification required. If the intention is to ‘negatively identify’, i.e., treat any individual not matching the resident database as a suspected militant and handle them as such, the database and processing requirements are formidable. Increased urbanisation and the search for livelihood have resulted in migrations from rural to urban areas, creating an inherent population flux in rural and semi-urban areas that needs to be accounted for. The picture library therefore cannot be limited to a particular area but needs to cover the entire Union Territory (UT), and perhaps even beyond.

    If the intent is to ‘positively identify’, i.e., confirm the identity of a terrorist, militant or an over ground worker (OGW), an exhaustive database of such individuals needs to be maintained. Such efforts may be hampered by the lack of updated photographs. The legal implications of creating and maintaining this database also need to be understood in detail by the Army. As of now, only two FRT projects are planned in the UT of Jammu and Kashmir (J&K): one by the J&K Police, which has collaborated with the Srinagar Municipal Corporation to install FRT across Srinagar to weed out terror threats, and the other by the Housing and Urban Development Department for authenticating the identities of applicants for residence.14
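    The two identification modes discussed above can be contrasted in a short sketch, reusing the cosine-similarity idea from the earlier example. This is an illustrative assumption rather than any deployed logic: negative identification flags anyone who matches no entry in the resident database, while positive identification flags anyone who matches an entry in a watchlist.

```python
# Illustrative contrast between 'negative' and 'positive' identification.
# Embeddings and the 0.6 threshold are assumptions, not real parameters.
import numpy as np

def cosine_similarity(a, b) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def negative_identify(probe_embedding, resident_db, threshold=0.6) -> bool:
    """True if the individual matches NO entry in the resident database,
    i.e., an unknown person who warrants further scrutiny."""
    return not any(cosine_similarity(probe_embedding, ref) >= threshold
                   for ref in resident_db.values())

def positive_identify(probe_embedding, watchlist_db, threshold=0.6) -> list:
    """Names of watchlist entries the individual matches, if any."""
    return [name for name, ref in watchlist_db.items()
            if cosine_similarity(probe_embedding, ref) >= threshold]
```

    The scale of the two approaches differs accordingly: negative identification must hold and search the entire resident population, whereas positive identification depends on a much smaller but well-curated and current watchlist.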

    The use and maintenance of FRT across the country and by various state governments has also been challenged in court by activists. Added to this are the challenges of poor visibility, changes in surroundings, the quality of photographs and, finally, the learning algorithm itself. False positives and false negatives are inherent to any AI-based system; in a CI/CT scenario, however, such outcomes may translate into a matter of life and death. FRT-based systems therefore need to be test-bedded, vetted and analysed before being deployed in real-life conditions.
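    One reason test-bedding matters is that the same similarity threshold trades one error against the other. The sketch below, using entirely hypothetical trial scores, shows how false negative and false positive rates shift as the threshold moves.

```python
# Sketch of the false-positive / false-negative trade-off at different
# similarity thresholds. The scores below are hypothetical illustrations.
import numpy as np

def error_rates(genuine_scores, impostor_scores, threshold):
    """False negative rate (missed genuine matches) and false positive rate
    (wrongly accepted impostors) at a given similarity threshold."""
    fnr = float(np.mean(np.asarray(genuine_scores) < threshold))
    fpr = float(np.mean(np.asarray(impostor_scores) >= threshold))
    return fnr, fpr

genuine = [0.82, 0.74, 0.55, 0.91, 0.63]    # same-person comparison scores
impostor = [0.41, 0.58, 0.35, 0.62, 0.29]   # different-person comparison scores
for t in (0.5, 0.6, 0.7):
    fnr, fpr = error_rates(genuine, impostor, t)
    print(f"threshold={t}: FNR={fnr:.2f}, FPR={fpr:.2f}")
```

    Raising the threshold suppresses false positives but misses more genuine matches, and vice versa; only field trials can establish an operationally acceptable balance.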

    Further, FRT broadly has two subsets when it comes to recognising an individual. The first is ‘facial recognition’, which identifies the individual. The second is ‘affect recognition’, which attempts to decipher the emotions, and thereby the intentions, of the individual. Affect recognition, though used by a number of FRT firms around the world, rests on shaky scientific foundations and may not accurately reflect, or even correlate with, the actual emotional state of the individual.15 Applying such dubious scientific standards in a CI/CT scenario may lead to cases of mistaken identity. Cybersecurity measures protecting the recorded data also need to be robust, lest the data be hacked, spoofed or altered.

    All these challenges are formidable and call for a careful, even cautious, approach to operationalising FRT. The Indian Army needs to consider these issues in detail before deploying AI-based systems in a CI/CT environment.

    Views expressed are of the author and do not necessarily reflect the views of the Manohar Parrikar IDSA or of the Government of India.
