LaBSE, a new language-agnostic embedding model, supports 109 languages with SOTA accuracy

August 19, 2020 by admin

Language-agnostic BERT Sentence Embedding (LaBSE) is a multilingual BERT-based embedding model that produces language-agnostic cross-lingual sentence embeddings for 109 languages by combining masked language modeling (MLM) and translation language modeling (TLM) pretraining.

Read more: Neowin
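Cross-lingual sentence embeddings like LaBSE's are typically compared with cosine similarity (equivalently, the dot product of L2-normalized vectors): translations of the same sentence land close together regardless of language. A minimal sketch of that comparison, using small toy vectors as stand-ins for real model output (the vectors and sentences below are illustrative assumptions, not actual LaBSE embeddings):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    a = a / np.linalg.norm(a)
    b = b / np.linalg.norm(b)
    return float(np.dot(a, b))

# Toy 4-dimensional vectors standing in for real high-dimensional
# sentence embeddings (hypothetical values for illustration only).
en = np.array([0.9, 0.1, 0.0, 0.2])    # "Hello, world."
es = np.array([0.85, 0.15, 0.05, 0.2]) # "Hola, mundo." (a translation)
de = np.array([0.1, 0.9, 0.3, 0.0])    # an unrelated sentence

print(cosine_similarity(en, es))  # high score: likely translations
print(cosine_similarity(en, de))  # lower score: unrelated sentences
```

In a language-agnostic embedding space, the translation pair scores markedly higher than the unrelated pair, which is what makes such models useful for cross-lingual retrieval and parallel-text mining.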