
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches that train a model on an NLP task such as language modeling or document classification.
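As a concrete illustration of how words are mapped to vectors of real numbers, the snippet below trains a small Word2Vec model. This is a minimal sketch, assuming gensim 4.x is installed; the toy corpus, hyperparameter values, and query words are illustrative only and do not come from any of the papers listed on this page.

    # Minimal Word2Vec sketch (assumes gensim 4.x; corpus and parameters are illustrative).
    from gensim.models import Word2Vec

    # Each "sentence" is a list of tokens; real training would use a large corpus.
    corpus = [
        ["the", "king", "rules", "the", "kingdom"],
        ["the", "queen", "rules", "the", "kingdom"],
        ["the", "cat", "sat", "on", "the", "mat"],
    ]

    # Train a skip-gram model: each vocabulary word is mapped to a 50-dimensional vector.
    model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

    vector = model.wv["king"]      # the learned embedding, a numpy array of shape (50,)
    print(vector.shape)            # (50,)
    # Cosine similarity between two learned word vectors.
    print(model.wv.similarity("king", "queen"))

On a realistic corpus, words that appear in similar contexts (such as "king" and "queen" here) end up with nearby vectors, which is what downstream NLP tasks exploit.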

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3001–3010 of 4002 papers

Title | Status | Hype
Neural Machine Translation between Myanmar (Burmese) and Rakhine (Arakanese) | | 0
Neural Machine Translation for Tamil–Telugu Pair | | 0
Neural Machine Translation from Historical Japanese to Contemporary Japanese Using Diachronically Domain-Adapted Word Embeddings | | 0
Neural Machine Translation of Logographic Language Using Sub-character Level Information | | 0
Neural Metaphor Detection with a Residual biLSTM-CRF Model | | 0
Neural Morphological Tagging from Characters for Morphologically Rich Languages | | 0
Neural Natural Language Processing for Unstructured Data in Electronic Health Records: a Review | | 0
Neural Networks and Spelling Features for Native Language Identification | | 0
Neural Networks for Cross-lingual Negation Scope Detection | | 0
Neural Networks For Negation Scope Detection | | 0
Page 301 of 401

Leaderboards

No leaderboard results yet.