How a teacher and her students are bringing queer equality to Wikipedia
Briefly

"Right now, so much of the information generated by AI is sourced from Wikipedia and therefore includes the biases that already exist - biases that overlook the contributions of women, people of color, sexual minorities, and other marginalized groups. Therefore, in a small way, Wikipedia pages that specifically highlight the histories and contributions of queer and trans people of color help counteract that bias, while also frequently highlighting the research of scholars of color and others who study these topics."
"Her students have made an impact. They have added more than 300,000 words and 3,500 references, edited 588 articles, and written 63 articles, Rodríguez says, noting that the project has garnered 96 million views."
"Right now, fewer than 20 percent of Wikipedia biographies are of women, despite Wikipedia being the largest digital encyclopedia ever created. These additions are also important because AI technologies learn and extract information from sites like Wikipedia. It is important to include fact-checked information from excluded communities, so that, in turn, these sites can train AI models as accurately as possible on excluded and marginalized communities."
UC Berkeley professor Juana María Rodríguez leads a 10-year Wikipedia project with students to document LGBTQ+ history and Latine women's contributions. The initiative has added more than 300,000 words and 3,500 references, edited 588 articles, and written 63 new articles, generating 96 million views. This work addresses the significant underrepresentation of women on Wikipedia, where fewer than 20 percent of biographies are of women despite the site being the world's largest digital encyclopedia. The project holds particular urgency given current attacks on LGBTQ+ history through legislation and book bans. The work also addresses AI bias: because artificial intelligence systems learn from Wikipedia, including fact-checked information about marginalized communities helps train AI models more accurately and counteracts existing biases that overlook the contributions of women, people of color, and sexual minorities.
Read at Advocate.com