LaBSE, a new language-agnostic embedding model, supports 109 languages with SOTA accuracy

August 19, 2020 by admin

Language-agnostic BERT Sentence Embedding (LaBSE) is a multilingual BERT embedding model that produces language-agnostic cross-lingual sentence embeddings for 109 languages by combining masked language modeling (MLM) and translation language modeling (TLM).

Read more… Neowin
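In practice, language-agnostic sentence embeddings like LaBSE's are commonly used for cross-lingual retrieval or bitext mining: sentences in different languages land near each other in a shared vector space, so translation pairs can be found by nearest-neighbor search. A minimal sketch of that matching step, with toy vectors standing in for real LaBSE embeddings (the `match_sentences` helper is hypothetical, not part of any LaBSE release):

```python
import numpy as np

def match_sentences(src_emb: np.ndarray, tgt_emb: np.ndarray) -> np.ndarray:
    """For each source embedding, return the index of the most
    cosine-similar target embedding."""
    # L2-normalize rows so the dot product equals cosine similarity.
    src = src_emb / np.linalg.norm(src_emb, axis=1, keepdims=True)
    tgt = tgt_emb / np.linalg.norm(tgt_emb, axis=1, keepdims=True)
    return np.argmax(src @ tgt.T, axis=1)

# Toy 2-D "embeddings": source rows 0 and 1 align with target rows 1 and 0.
src = np.array([[1.0, 0.1], [0.1, 1.0]])
tgt = np.array([[0.2, 0.9], [0.9, 0.2]])
print(match_sentences(src, tgt))  # → [1 0]
```

With real LaBSE embeddings the vectors would be 768-dimensional and come from the published model (e.g. via TensorFlow Hub), but the nearest-neighbor matching logic is the same.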