India plans mass surveillance through facial recognition

Conscious Mind's
Published on 15 Sep 2020 / In Film and Animation

The dystopia of mass surveillance in George Orwell’s novel 1984 seems closer than ever before, largely because of advances in artificial intelligence technologies like Facial Recognition Technology, or FRT. Though FRT systems sound like technology of the future, they are already in place and being used by governments across the world for security purposes.

China, which tolerates no talk of democracy, has installed the largest centralised FRT system in the world. There are over 200 million closed-circuit television (CCTV) cameras in the country from which data can be collected and analysed, along with 20 million specialised FRT cameras that collect data continuously for analysis. China is currently using these systems to surveil the ethnic Uyghur minority in the re-education camps it has set up in the Xinjiang region, where continuous surveillance is used to manipulate detainees’ behaviour. China also used FRT systems to profile protestors during the pro-democracy protests in Hong Kong. These steps have raised worldwide concerns about the cultural bias that can become embedded in such systems, and have put into question a person’s right to freedom of expression, privacy and basic dignity.

But how does an FRT system work? It can be as simple as the face identification in a smartphone, which works as a password and verifies a user’s face to unlock the phone. Other everyday examples are the myriad face-modification filters in apps like FaceApp, Instagram and Snapchat. As the complexity of an FRT’s purpose increases, so does the complexity of the analysing system and its intelligence. The most prevalent FRT models create a mathematical representation of a person’s face, which is then compared to the representations stored in an existing database.

But where does India stand in all this? India is not far behind.
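The matching step described above — turning a face into a mathematical representation and comparing it against a database — can be sketched in a few lines. This is a minimal illustration, not any real FRT system: the `embed` function below is a hypothetical stand-in for the neural network a real system would use, and the names and threshold are invented for the example.

```python
import numpy as np

def embed(face_pixels: np.ndarray) -> np.ndarray:
    """Hypothetical stand-in for a learned face-embedding model.
    A real FRT system would run a trained neural network here; we
    just flatten the pixels and normalise to a unit vector."""
    v = face_pixels.flatten().astype(float)
    return v / np.linalg.norm(v)

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity of two unit-length embeddings (1.0 = identical)."""
    return float(np.dot(a, b))

def identify(probe: np.ndarray, database: dict, threshold: float = 0.95):
    """Compare a probe embedding against every enrolled embedding and
    return the best match above the threshold, or None if no face in
    the database is similar enough."""
    best_name, best_score = None, threshold
    for name, reference in database.items():
        score = cosine_similarity(probe, reference)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Enrol two synthetic "faces" and identify one of them.
rng = np.random.default_rng(0)
face_a = rng.random((8, 8))
face_b = rng.random((8, 8))
database = {"person_a": embed(face_a), "person_b": embed(face_b)}
match = identify(embed(face_a), database)
```

The design mirrors the one-to-many search the description mentions: every comparison is a similarity score, and the threshold decides what counts as a "match" — which is exactly where the accuracy problems discussed later come from.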
It is building one of the largest centralised FRT surveillance systems in the world, called the National Automated Facial Recognition System, or AFRS. In 2019 the National Crime Records Bureau (NCRB), under the Ministry of Home Affairs, issued a request for proposal (RFP) for companies interested in creating the AFRS. This RFP has undergone multiple changes over the last year owing to pushback from civil society organisations, mainly because there is no law governing facial recognition systems in the country and the current systems are widely inaccurate. The deadline was also extended many times, twice in just the last month; the current deadline is September 17.

FRT systems are already being used by police forces in seven Indian states, though these are basic and rudimentary, unlike what is being deployed in China. FRT is also being used for identification by civil departments across the country. For instance, the Ministry of Civil Aviation started the Digi Yatra programme on a trial basis at airports including Hyderabad (July 2018), Delhi (September 2018) and Bengaluru (December 2018). The programme allowed passengers to check in using facial recognition as the boarding pass, making the process paperless.

But the major problem with FRT in India, precisely because the systems are so rudimentary, is inaccuracy, which can lead to misidentification and in turn to false accusations and arrests. In 2018 the Delhi Police reported that their trial FRT system had an accuracy of only 2 percent, a figure admitted in an affidavit filed by the Delhi Police in the Delhi High Court. Worse, in 2019 the Ministry of Women and Child Development reported that the accuracy of the current FRT systems in India was only 1 percent; the ministry reported that the system could not even distinguish between boys and girls.
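Why does low accuracy translate into false accusations? A short base-rate calculation makes it concrete. The numbers below are purely illustrative, not figures from the video: when a watchlist is tiny relative to the population being scanned, even a system with a low false-positive rate produces mostly wrong flags.

```python
def false_match_share(population: int, watchlist: int,
                      true_positive_rate: float,
                      false_positive_rate: float) -> float:
    """Fraction of all flagged faces that are NOT actually on the
    watchlist. Illustrative arithmetic only; all parameters are
    hypothetical inputs, not real deployment figures."""
    true_hits = watchlist * true_positive_rate
    false_hits = (population - watchlist) * false_positive_rate
    return false_hits / (true_hits + false_hits)

# Hypothetical scenario: 1,000,000 faces scanned, 100 on a watchlist,
# a 99% detection rate and just a 1% false-positive rate.
share = false_match_share(1_000_000, 100, 0.99, 0.01)
```

Even under these generous assumptions, roughly 99% of the people the system flags would be innocent lookalikes — which is why deploying a 1–2% accurate system for policing is so alarming.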
Despite this poor track record, India continued to use FRT in 2020. According to Union Home Minister Amit Shah, over 1,900 faces were identified through facial recognition software as having incited violence in the national capital during the Delhi riots at the end of February. Shah said that driving-licence and voter-ID-card information was used to carry out the identification.
Even if an FRT system is accurately implemented, there will be function creep: a technology or system gradually widens its scope beyond its original purpose to fulfil wider functions. For instance, the Delhi Police used FRT to track down people present at the protests against the Citizenship (Amendment) Act (CAA) and the National Register of Citizens (NRC) that occurred from December 2019 to March 2020. In the absence of a national, or even state, policy governing artificial intelligence systems like FRT, function creep raises grave concerns about people’s right to freedom of expression and their right to privacy.
