Facebook AI has open-sourced Blender, which, according to the Facebook blog, is the largest open-domain chatbot to date. The model was trained with up to 9.4 billion parameters – nearly four times as many as Google’s Meena.
Facebook also said that Blender is the first chatbot to blend a diverse set of conversational skills – including empathy, knowledge and personality – into a single system. The blog added that human evaluators found its conversations more engaging and more human-sounding.
“Our neural networks are too large to fit on a single device, so we utilized techniques such as column-wise model parallelism, which allows us to split the neural network into smaller, more manageable pieces while maintaining maximum efficiency,” Facebook stated.
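The idea behind column-wise model parallelism can be illustrated with a toy sketch: a single layer’s weight matrix is split by columns across devices, each device computes its slice of the output, and the slices are concatenated. The function names and shapes below are illustrative assumptions, not Facebook’s actual implementation.

```python
import numpy as np

def column_parallel_linear(x, weight, n_devices):
    """Split `weight` (in_dim x out_dim) column-wise into n_devices shards."""
    # Each "device" (here just an array shard) holds a slice of the columns.
    shards = np.array_split(weight, n_devices, axis=1)
    # Every device sees the full input but only its own weight columns.
    partial_outputs = [x @ shard for shard in shards]
    # Gathering the partial outputs reproduces the full layer output.
    return np.concatenate(partial_outputs, axis=-1)

x = np.random.randn(2, 8)    # batch of 2, hidden size 8
w = np.random.randn(8, 12)   # full weight matrix
out = column_parallel_linear(x, w, n_devices=4)
assert np.allclose(out, x @ w)  # identical to the unsharded computation
```

Because the concatenated result matches the unsharded matrix product exactly, the split preserves correctness while letting each shard fit on a smaller device – the efficiency concern Roller describes below is about choosing such splits well.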
“There are a lot of sophisticated techniques that you have to use in terms of how you chop this thing up. If you split it over different devices and chop it up like the wrong way, then you’re going to lose that efficiency that you have and you’re not going to be able to scale to these terabyte-sized data sets that we’ve been working with,” Stephen Roller, a research engineer at Facebook’s AI lab (FAIR), was quoted as saying by Engadget.
The social media giant also said it has introduced Blended Skill Talk (BST), a setup for training and evaluating specific conversational skills, including use of personality, use of knowledge, and display of empathy.
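One way to picture the blending idea is a training stream that interleaves examples drawn from skill-specific dialogue datasets. The dataset names and the uniform sampling scheme below are placeholder assumptions for illustration, not the actual BST construction.

```python
import random

def blend_skill_datasets(datasets, n_examples, seed=0):
    """Sample a blended stream, picking a skill dataset uniformly per example."""
    rng = random.Random(seed)
    blended = []
    for _ in range(n_examples):
        skill = rng.choice(sorted(datasets))          # pick a skill at random
        utterance = rng.choice(datasets[skill])       # then one of its examples
        blended.append({"skill": skill, "utterance": utterance})
    return blended

# Hypothetical skill-specific examples standing in for real dialogue data.
datasets = {
    "persona": ["I love hiking on weekends."],
    "knowledge": ["The Eiffel Tower was completed in 1889."],
    "empathy": ["That sounds really tough, I'm sorry to hear it."],
}
stream = blend_skill_datasets(datasets, n_examples=6)
```

A model trained on such a mixed stream sees all three skills in one curriculum, rather than mastering each in isolation.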
“To make sure conversational agents don’t repeat themselves or display other shortcomings, researchers typically use a number of possible generation strategies after the model is trained,” Facebook said.
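One widely used generation strategy of this kind is n-gram blocking: during decoding, any candidate token that would complete an n-gram already present in the output is disallowed. The toy check below illustrates the general technique with 3-grams; it is an assumption-laden sketch, not Blender’s actual decoder.

```python
def blocked_ngrams(tokens, n=3):
    """Return the set of n-grams already present in the token sequence."""
    return {tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)}

def violates_block(tokens, candidate, n=3):
    """Would appending `candidate` repeat an n-gram already in `tokens`?"""
    if len(tokens) < n - 1:
        return False  # too short to form a full n-gram yet
    ngram = tuple(tokens[-(n - 1):]) + (candidate,)
    return ngram in blocked_ngrams(tokens, n)

# The decoder may not emit "very" a fourth time, since the 3-gram
# ("very", "very", "very") already occurs in the output.
assert violates_block(["very", "very"], "very") is False
assert violates_block(["very", "very", "very"], "very") is True
```

At each decoding step, tokens failing this check are simply excluded from the candidate set, which is one cheap way to keep a model from looping on the same phrase.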