The chatbot, called BlenderBot 3, is accessible online, although residents of Canada do not yet have access to it.
According to Meta, the software can hold a conversation with Internet users, but it can also offer them help, whether that means finding a recipe or suggesting a cultural activity in their city.
BlenderBot 3 was trained on massive amounts of text and information so that it can respond coherently to user requests.
Taming BlenderBot 3
With the public's help, the tech giant now wants to ensure its software minimizes bias and avoids vulgar or culturally insensitive language, both well-known problems in the field of artificial intelligence.
Internet users who wish to test BlenderBot 3's capabilities are therefore asked to consent to having their conversations analyzed by Meta. They can also flag strange or problematic responses from the software.
In exchange, the technology giant has committed to publicly releasing all the data collected during its trials in order to advance the development of chatbots.
Tay’s short life
Meta isn't the first company to release a conversational agent of this kind to the public.
In 2016, Microsoft launched Tay on Twitter. The bot was designed to learn from its interactions with the public, but Internet users quickly taught it racist and sexist phrases, forcing the company to end the experiment less than 24 hours later.