LaBSE, a new language-agnostic embedding model, supports 109 languages with SOTA accuracy

August 19, 2020 by admin

Language-agnostic BERT Sentence Embedding (LaBSE) is a multilingual BERT-based embedding model that produces language-agnostic cross-lingual sentence embeddings for 109 languages by combining masked language modeling (MLM) and translation language modeling (TLM).

Read more… Neowin
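As a rough sketch of how language-agnostic sentence embeddings are typically used, cross-lingual comparison reduces to cosine similarity between sentence vectors: translations land close together in the shared space. The vectors below are toy placeholders standing in for model output (real LaBSE embeddings are 768-dimensional), not values produced by LaBSE itself.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two sentence embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy stand-ins for sentence embeddings (hypothetical values for illustration).
emb_en = np.array([0.9, 0.1, 0.2])      # e.g. "Hello, world"
emb_de = np.array([0.85, 0.15, 0.25])   # e.g. "Hallo, Welt" (a translation)
emb_unrelated = np.array([0.1, 0.9, -0.3])  # an unrelated sentence

# In a language-agnostic embedding space, a sentence and its translation
# score higher than a sentence paired with unrelated text.
print(cosine_similarity(emb_en, emb_de) > cosine_similarity(emb_en, emb_unrelated))
```

In practice the embeddings would come from the published model rather than hand-written vectors; the cosine comparison itself is unchanged.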