Since the whole computer processor thing isn’t really working out anymore, Intel’s pivoting into the more important business of automatically censoring hate speech in video game voice chats — RIP to your heated gamer moments.
Intel showed off its new ‘Bleep’ software, developed in partnership with Spirit AI, in a 40-minute virtual presentation as part of GDC 2021. Bleep uses artificial intelligence (AI) to “bleep” out hate speech in real time, hence the name. The software is currently in beta after a prototype was developed two years ago, and Intel showed some screenshots of its settings during the presentation.
Unsurprisingly, the internet responded by roasting Intel and turning Bleep into a meme.
Part of the issue was that Intel’s screenshot of choice showed off Bleep’s filter settings, which let users select whether they want ‘None,’ ‘Some,’ ‘Most’ or ‘All’ of various categories of hate speech, including misogyny, name-calling, use of racial slurs, racism and xenophobia, sexually explicit language and swearing.
Many folks online found the notion that someone would be fine with some, but not all, hate speech funny. That led to plenty of tweets mocking Bleep:
https://twitter.com/beesmygod_/status/1379940621977325570?s=20
https://twitter.com/Strange_Sunset/status/1379954282703638532?s=20
Intel quickly moved to defend its new software. Speaking to Polygon, Intel’s general manager of gaming, Marcus Kennedy, said that “the intent of [Bleep] has always been to put that nuanced control in the hands [of] users.” As an example, some people may be okay with a certain level of ‘shit talk’ with friends, but not with a stranger. Kennedy also suggested that one screenshot wasn’t enough to capture the experience of using the product.
Further, Intel stressed that the Bleep software wasn’t final and could change before release.
While those points are fair, Bleep likely isn’t the best solution to what is frankly a rampant problem in online gaming spaces (and online spaces in general). It would help some people, but it largely sweeps the issue under the rug instead of fixing it. Intel seems to recognize this, even after spending two years working on Bleep; that the company thought the software was worth developing at all shows how bad online toxicity has become.
As fun as it is to mock Intel and Bleep, the real joke is that we allowed hate speech and bigotry to freely propagate in online spaces to the point that this software became necessary.
Via: Polygon