A digital hello from us, dear readers. After our last post on vaccine distribution in India and the digital divide, there has been a flood of coverage of the subject. The BBC deals with it in passing here. NDTV devoted a show to it:
The Hindustan Times found that "doses administered in urban districts were nearly 1.7 times those in rural districts as on May 13". Their report is here.
Al Jazeera wrote "Online registration for the jab bars up to half of India’s population, mainly in poor and rural areas, who do not have smartphones or internet access."
ET Telecom has a report with the headline "Low smartphone reach coupled with lack of digital literacy hit rural India Covid vaccine drive".
We at the Digital Empowerment Foundation welcome this attention being paid to the digital divide by the mainstream media, belated as it may be.
Oh, by the way, we are proud to share with you the 'Digiज्ञान' programme! Under the programme, DEF has set up a digital classroom in the Government Senior Secondary School at . The aim is to bridge the digital divide among students across socio-economic and geographical barriers and to empower rural children through digital literacy. So far, 200 school students have been successfully registered.
'Digital' means one thing for those who have access and quite another, exclusion, for those who do not. The haves ask: will food delivery via 'apps' continue during the lockdown? This query is amplified, echoed and answered many times over. Far less attention goes to the have-nots, in this context the gig workers, the ones doing the food delivery. Entrackr published a story on the 13th of this month, bringing some focus to these workers.
As you might expect, dear readers, we had once again delved deep into the issue well in time.
In case this interests you, you may also like to read the 2020 edition of Global Information Society Watch (GISWatch), which published Digital Empowerment Foundation's country research report, 'Economy on the Margins: Risks and Exclusion of Informal Sector E-waste Recyclers in Policy and Practice'. The report makes the case for a conversation around e-waste management policies that recognise the nuanced role of informal sector workers, and it seeks to acknowledge the socio-economic realities of informal sector e-waste workers in order to create a more inclusive policy space.
The complete report is here.
This week, no conversation about the unempowered can be complete without mentioning Israel's attack on Palestine. No global event of any consequence happens in a vacuum away from 'digital'. Big Tech, as usual, continues to silence the oppressed.
An opinion piece in Al Jazeera by Omar Zahzah notes:
"In April, after Zoom refused to host the “Whose Narratives?” event for the second time – following pressure from an Israeli government app and several right-wing Zionist organisations – Facebook not only took down publicity posts about the event, but also deleted the page of the AMED Studies program from its platform in its entirety, effectively erasing a vast archive of talks, discussions and documents on the Palestinian liberation struggle and its relationship to freedom movements from around the world. These materials were being intentionally shared and stored on Facebook for academics, activists, organisers and the community at large to be able to engage with them free of charge and without restriction.
Coming on the heels of Zoom’s repeated attempts to arbitrate what is and is not acceptable speech in academia, Facebook’s deletion of the AMED page made clear Big Tech’s modus operandi when it comes to Israel-Palestine: censor material related to the Palestinian struggle on Israel’s demand, and ignore any criticism of these unlawful and unjust actions.
Israel and its allies are not only pressuring Big Tech to silence the Palestinians from outside. Facebook’s oversight board, an independent body tasked with deliberating on the platform’s content decisions, includes former director-general of the Israeli ministry of justice, Emi Palmor. Palmor personally managed Israel’s Cyber Unit in the past, which successfully lobbied for the removal of thousands of pieces of Palestinian content from Facebook."
Nothing new here, of course. This, from 2016:
There is a pattern to Big Tech's censorship, whether in the USA, Myanmar, Sri Lanka, India or elsewhere. Big Tech sides with the bully. It censors first and offers absurd explanations later, if at all.
All these powers of censorship vanish into thin air when they are needed to protect the vulnerable. In this case, men 'auctioning' women on YouTube.
Big Tech is notorious for being fertile ground for hate, whether on the basis of gender, caste or religion.
Meanwhile, WhatsApp has once again extended its 'deadline' for forcing users to accept terms which violate their privacy.
We strongly recommend that you go through the chart below and make an informed decision about the messenger application you would like to use.
This week at the Delhi High Court, WhatsApp claimed:
Is it really as simple as that? No. In short: WhatsApp holds a dominant position in, among other markets, India, where millions of devices ship with only WhatsApp pre-installed, so people cannot really 'choose'. And if IRCTC and Aarogya Setu have similar terms, that is grounds for their terms to be changed too. The government of India is opposing WhatsApp's policy in court. The Internet Freedom Foundation has a good explainer about the problems with WhatsApp's policy here:
There is a lot of talk about 'positivity' these days. We have some positive news to share.
As a part of COVID-19 ground report series, DEF documented people from rural India who mastered the art of using smartphones to become digitally literate and skilled community journalists; learning English to become cricket match commentators; sharing folk music on YouTube; and using WhatsApp to find buyers for their handloom and handicraft materials without being comprehensively literate, educated, schooled or qualified.
The complete report is here.
Of course there is more; we work hard.
Digital Empowerment Foundation, in collaboration with USAID and DAI, is implementing a project named "Strengthening and Building Resilience of Women Entrepreneurs (WEs) and Women-led Community Development Organisations (CDOs) through Digital Up-skilling in India", also known as 'Digital Sarthak'. For this purpose, a survey was conducted across seven states and ten districts, namely Assam (Dist. Nagaon, Hailakandi), Uttar Pradesh (Dist. Barabanki), Jharkhand (Dist. Khunti, Ramgarh), Bihar (Dist. West Champaran), Madhya Pradesh (Dist. Guna), Rajasthan (Dist. Alwar, Barmer) and Haryana (Dist. Nuh). The report attempts to understand the current patterns of usage of digital tools amongst women entrepreneurs, their familiarity with financial tools and overall financial literacy, their existing business strategies and, lastly, their economic situation post-lockdown.
The complete report is here.
Finally, our content recommendation. This week we watched the documentary 'Coded Bias', which is available on Netflix. Tarun Pratap, our in-house Netflix devourer and senior research associate, has this to say about the docu:
The documentary follows MIT Media Lab researcher Joy Buolamwini, who realised while working on facial-recognition software that its algorithm did not recognise her face until she put on a white mask. She went on to collect data showing that facial recognition was almost fully effective only for white male faces, followed by white female faces, then men of colour and, lastly, women of colour. This prompted her to look into it further. The bias was in the system because the system was designed by engineering teams made up largely of white men. Hiring algorithms used by companies like Amazon were rejecting people on the basis of colour and even gender, and failed entirely to recognise people with different gender expressions: a system clearly stacked against society's minorities.

So how did this system come about? Through the data fed into the AI. The problem is not whether the AI is racist; the problem is that the AI replicates the world it receives its information and knowledge from, a world that is largely sexist and racist. The AI just repeats it through mathematical equations. For example, in America the majority of prisoners are people of colour rather than white, for reasons including poverty and state bias. The AI sees this maths and treats it as a linear, inevitable outcome, concluding that a person of colour is mathematically far more likely to be involved in criminal activity based on the existing ratio. This is not only racist; it violates the basic human right, recognised in all democracies, that a person is innocent until proven guilty.
The documentary further explores how America lacks a central data protection law, which has left corporations like Facebook and Amazon free to use technologies like facial recognition, along with other personal data, for their financial advantage. As the film puts it, it is now humans who are the commodities: they are sold, along with their data, to other companies by the companies that control that data, creating a society of surveillance. For years, writers and filmmakers have been fascinated by the idea of a dystopian society, but those stories always imagined state surveillance; in reality, it is corporate surveillance. Facebook has 2.6 billion users, far more than the most populous country in the world, making Facebook, in a way, the largest populated entity. There is not yet enough evidence, but it is entirely possible for these companies to use that data politically, to swing elections in favour of the highest bidder.
Coded Bias is a commentary on the times we live in. It shows how AI as a technology is dangerous so long as those who build and deploy it carry subconscious racial and gender biases. I personally have been stopped by policemen at metro stations for extra checks for no apparent reason except looking a certain way, the result of those men's subconscious bias. Now AI is doing the same, and perhaps more vehemently, because the racial and gender bias it perpetrates rests on "mathematical calculations and estimates", however flawed that maths may be.
That's all folks. Until next time.
TypeRight - The Digital Nukkad