This is one of the narratives of communities impacted by data injustice, published in the Digital Empowerment Foundation's research report on data justice.
The injustices caused by the algorithms of platform and gig-economy apps have been documented previously. In India, gig-economy workers are counted as "clients," depriving them of many of the protections that labour laws provide. In this unorganised sector, Shaik Salauddin of the Indian Federation of App-Based Transport Workers (IFAT) is one of the leaders organising and unionising people who work on ride-hailing and delivery apps. We speak to him in detail about the algorithms that cause these injustices.
In December 2019, the Indian Parliament passed the controversial Citizenship Amendment Bill, alongside the government's commitment to enforce a National Register of Citizenship. As Booker Prize-winning author and activist Arundhati Roy put it, "Coupled with the Citizenship Amendment Bill, the National Register of Citizenship is India's version of Germany's 1935 Nuremberg Laws, by which German citizenship was restricted to only those who had been granted citizenship papers—legacy papers—by the government of the Third Reich. The amendment against Muslims is the first such amendment." Noting the use of an automated tool to decide the lineage of people in Assam, we spoke to Abdul Kalam Azad, a researcher from Assam now at Vrije Universiteit Amsterdam, who has looked in detail at the issues and exclusions created by the NRC in Assam. Learning of the exclusion of trans people from the same list (a community already facing an undemocratic law in the Trans Act), we spoke to two activists from the trans community: Sai Bourothu, who has worked with the Queer Incarceration Project and the Automated Decision Research team of the Campaign to Stop Killer Robots, and Karthik Bittu, a professor of neuroscience at Ashoka University, Delhi, and an activist who has worked with the Telangana Hijra, Intersex and Transgender Samiti.
Another exclusion we noted in our primary research was that of the homeless from data enumerations. We spoke to Jatin Sharma and Gufran, who are part of the homeless shelter at Yamuna Ghat, about these exclusions and how they lead to homeless people being denied basic healthcare and life-saving TB treatment.
Four researchers, activists and civil society leaders who have done considerable work on data-related exclusions, surveillance, and identification systems such as Aadhaar offered their perspectives on the debates, conversations and potential reimaginings of data injustice: Srinivas Kodali, independent activist and researcher; Nikhil Dey of the Mazdoor Kisan Shakti Sangathan; Apar Gupta, lawyer and director of the Internet Freedom Foundation; and Rakshita Swamy, an NLU professor who also heads the Social Accountability Forum for Action and Research.
At the Margins of Urban and Data: Homeless Population
Our interactions at the homeless shelters in New Delhi revealed the depth of the digitalised system of governance and health. Almost every health service, from following up on tuberculosis (TB) treatment (India has the world's highest TB incidence, with over 2.64 million cases) to accessing vaccination or even simply getting admitted to a hospital, requires one to have identification such as Aadhaar, and at times a mobile phone to which verification OTPs are sent. A health scheme named Nikshay was designed by the government to cater to the nutritional requirements of recovering TB patients: under the scheme, a sum of Rs. 500 (~$6.5) is transferred to the bank account of TB patients under treatment. Despite the high occurrence of TB in the homeless population, many cannot avail of this scheme for lack of a Nikshay ID and a bank account, which in turn require IDs they do not have. Most of them do not have a mobile phone, as keeping one safe is difficult, and so they depend on shelter staff for all OTP-based ID authentication. This is further complicated by their status as migrant workers, who travel from place to place and cannot return to a single shelter to avail of any entitlement service.
"There's almost a 50-60 percent difference between the enumeration done by civil society groups and the state census, because the police are the ones who are doing the data collection." "The issue here is that with technological interventions, there is this assumption going around that you'll get a holier truth now, an unquestionable truth of some kind through the technology, without the realisation that the theory-ladenness of the data that technology will gather comes from human values. And what kind of human values are encoded in that algorithm will determine what kind of data you're collecting, and so exclusion is happening at that level. Now, technology or no technology, you'll have outliers and exclusions. But once this is institutionally acknowledged, and it is recognised that the problem lies here, there'll at least be scope for challenging, negotiating and so on."
Another researcher on data policy, Srinivas Kodali, who has worked on Aadhaar and disenfranchisement, points to the inevitability of it all: data is being collected, and that has all but become a given. But when the state says it is collecting citizens' health data, it means it only from an economic point of view that benefits a very few, such that people can be sold products. Will the health data collected reveal that platform workers have declining health, or are susceptible to accidents because of incentive targets, and therefore need to be covered under benefits? That is another analysis that can come out of the data, depending on how it is looked at. "I don't think [the workers] are saying that we don't collect our data. Do actually collect it, but you're not doing that justice part," says Srinivas. "I'm just hoping that the AI revolution that the Indian government speaks about actually does take note of the ground workers on the street," he says, pointing towards an inclusive use of data. Under the Smart Cities Mission the government launched, much of the money given out was for data collection: building dashboards of data about cities. What gets counted in these is another question of exclusion. Slums figure in the data only insofar as funding drives the number of slums counted, and homeless or slum populations are recorded as nuisances to traffic or to the gentrified city. A telling comparison: AI was deployed to detect people without masks and fine them, instead of supplying masks.
We "essentially have to give control to the people, you have to let them access these datasets, use them for their own good, but what is actually happening is that the government has taken control of people's data, and if we look at all the national programmes or data collection programmes of the government, they are essentially claiming our bodies as government property." But it also matters how the government identifies or classifies these sections of people: "[We] don't know how the government is going to classify you. The identity petitioners who were in front of the Supreme Court were saying, we don't want to be identified as these kinds of people forever. We don't want to be identified as slum dwellers, we don't want to be identified as some kind of different people, we don't want that stamp to exist." This, he says, is the kind of profiling that Aadhaar was going to create, and this was known. There was no participation in the project, no means to ensure equity or to question existing power structures. India's UID does not ensure that no harm is done. The amount of data collected without transparency disempowers people, perhaps restricting their access to finance because of the categorisation that happens with data.

Srinivas acknowledges that the government claims to have opened up data through new policies such as Open Data. But will that qualify as data justice? The issue, he says, "is that without the knowledge, without further internal data sets that the government possesses, you can't use these data sets. You don't have the computational infrastructure, you don't have the skill set of people to do anything with it. So data justice, without actually giving out the funds, giving out the infrastructure, giving out the manpower, is not going to be enough." When only those with access gain, the gains are asymmetric.