
Against the technopatriarchy

Women and dissidence come together to cross the last great frontier: that of digital equality. With what results? It remains to be seen.


Not one, or two, or three: thousands. Ninety-five thousand, to be exact. That was the number of misogynistic, violent or sexually charged messages (or all three at once) that BIA received during 2020. Disclaimer: BIA is not a girl but an artificial intelligence; more precisely, the chatbot of the Brazilian bank Bradesco. Its name, an acronym, makes it clear: Bradesco Artificial Intelligence or, for short, BIA. But that made no difference.

For a year, through home banking, chat and WhatsApp, the chatbot in charge of guiding customers through their banking operations received what any woman in a similar position receives: innuendo, insults, photos of genitalia and, above all, an unusual amount of violence. Nothing surprising if we consider that, as the United Nations points out, in artificial intelligence there is only one woman for every five programmers. Developer Ana Fukelman, creator of Lunar App, an innovative menstrual cycle app, calls this "the boys' club."

And she explains: "Although in Latin America we use many products that we could call femtech, most of them are not created by people from the region, and even fewer by women or dissidents. There is low female participation in the fields of finance, programming and engineering. And when it comes to financing these new technologies, women are also scarce: only between two and three percent of the capital available for emerging technology companies goes to women."

“Only between two and three percent of the capital available for emerging technology companies goes to women” Ana Fukelman

BIA was just one example of that absence of women, of a technological gender gap whose disastrous results are the norm, but whose solutions also point the way out. Faced with the BIA case, a new programming team decided to take action (and rewrite code), radically changing the way the artificial intelligence reacts to its harassers. Before, when it was programmed by men, BIA joked or kept silent. Not anymore. "Don't talk like that to me or to anyone," the program now fires back at the aggressor of the day.

With prejudice, stereotypes and the sexual division of labor in between, for decades women in technology and its associated fields have been treated almost like a "bug," an error to be hidden. And hidden they were: the women who decoded the enemy's messages during World War II, or Margaret Hamilton, the mathematician who wrote the code that took humanity to the moon. Unfortunately, things have not changed much since then. We do have an International Day of Women and Girls in Science (celebrated every February 11 since 2015), but according to a UNESCO report only a third of technology positions are held by women.

Worse still, gender stereotypes have permeated the artificial intelligence that, little by little, manages everything from health systems to banks, a variety of workplace processes, the screening of job applicants and much more. "The use of AI technology will affect women's opportunities at work as well as their position, status and treatment in the workplace," warns a joint report by the IDB, the OECD and UNESCO published in March 2022 under the title "The Effects of AI on the Working Lives of Women." "By using artificial intelligence, governments, institutions and companies should reduce gender gaps, not perpetuate or exacerbate them," the document recommends. But so far, at least, that remains more wishful thinking than reality.

“If systems are not developed by diverse teams, they are less likely to address the needs of diverse users or align with human rights”

VIRTUAL ASSISTANTS, REAL STEREOTYPES

We tend to believe that everything related to technology shares the supposed "neutrality" of mathematics, its status as a field free of bias and prejudice. But no: as Beatriz Busaniche, an expert in digital rights and president of the Vía Libre Foundation, explains, "biases are always there. There are biases based on race, class and, of course, gender."

But why should we bother to "read" algorithms through a gender lens? To begin with, because, as Dr. Cathy O'Neil, author of Weapons of Math Destruction (Capitán Swing), puts it: "Algorithms are opinions embedded in code." Translation: the machines around us are programmed by people who built their own biases and beliefs into each program. Thus, someone decided that Alexa, Amazon's assistant, would have a female voice. Someone else programmed the artificial intelligence of job search platforms to rank single, childless men with an MBA higher among the candidates. They call it technology, but it is nothing but technopatriarchy in action. As the UNESCO document acknowledges: "If systems are not developed by diverse teams, they are less likely to address the needs of diverse users or align with human rights. For example, online games are often criticized for their gender bias and other discriminatory features."

But in the hands of artists, activists, programmers, experts in computational language and many others, neither the platforms nor the digital world have to remain what they have been until now. In fact, after the Supreme Court of the United States reversed the ruling that had legalized abortion for almost half a century, programmers and app developers decided to protect other women from the techno-surveillance that, by combining menstrual cycle data with GPS data, could land them in court. Something of the sort has already happened to Celeste Burgess and her mother, Jessica, charged by the state of Nebraska (the former as a perpetrator, the latter as an accomplice) over the abortion Celeste had as a minor. The evidence for the prosecution? The private WhatsApp conversations between Celeste and Jessica, although the authorities could just as well have drawn on any of the menstrual tracking apps that millions of women use to predict their fertile days. Today, with abortion now a crime in 13 US states, Celeste's case stands as a warning, and with it in mind there are already people imagining and building safer, atavism-proof apps. "In our case, we do this by 'unhooking' personal data from medical data. There is no way of knowing which woman consulted or posted about what," explains Fukelman.

Because, after all, those who first ventured to build the world of bits did so dreaming of a space of liberation, not one of control and discrimination. The girls and women who today are pushing out the frontiers of the digital universe so that the rest of us can follow know it, and maybe that is why they do not rest. The technopatriarchy is still there, operating in the bowels of a system made to repel us, but whose walls we have already begun to dynamite. From within.
