Artificial intelligence · from Kotaku · 1 month ago
Wikipedia Won't Add AI-Generated Slop After Editors Yelled At Them
Wikimedia Foundation canceled AI-generated summaries due to strong backlash from its community of human editors.
from The Verge · 2 months ago
Wikipedia fights the UK's 'flawed' and 'burdensome' online safety rules
"As a Category 1 service, Wikipedia could face the most burdensome compliance obligations, which were designed to tackle some of the UK's riskiest websites."
Artificial intelligence · from The Verge · 3 months ago
Wikipedia is giving AI developers its data to fend off bot scrapers
Wikimedia is releasing a dataset for AI training to curb Wikipedia scraping. A partnership with Kaggle aims to provide structured data for AI model training. The dataset includes well-structured information while reducing server load from AI bots.
Artificial intelligence · from Techzine Global · 3 months ago
Wikimedia is dealing with a 50 percent increase in bandwidth due to AI crawlers
Wikipedia's bandwidth usage surged 50%, mostly due to AI crawlers rather than human users. AI scraping creates challenges for Wikimedia, slowing access and increasing costs.
Artificial intelligence · from The Register · 3 months ago
Wikimedia Foundation bemoans AI bot bandwidth burden
Web-scraping bots are straining Wikimedia's resources, increasing bandwidth usage by 50% since January 2024 and heavily impacting project sustainability.
Privacy technologies · from Ars Technica · 3 months ago
AI bots strain Wikimedia as bandwidth surges 50%
AI crawlers are circumventing established rules, creating challenges for content platforms. Wikimedia is focusing on a systemic initiative to address scraping issues and protect its infrastructure.
Artificial intelligence · from TechCrunch · 3 months ago
AI crawlers cause Wikimedia Commons bandwidth demands to surge 50%
Automated scrapers are responsible for a 50% increase in multimedia bandwidth usage on Wikimedia Commons.
OMG science · from Engadget · 3 months ago
Wikipedia is struggling with voracious AI bot crawlers
AI crawlers are causing a 50% increase in Wikimedia's bandwidth, threatening user access to content.