
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
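
Concretely, an embedding is just a lookup table: each vocabulary word is assigned an index into a matrix whose rows are dense real-valued vectors. A minimal illustrative sketch follows; the toy vocabulary, the dimensionality, and the random initialization are arbitrary choices for demonstration, not from any particular system (training would replace the random rows with learned ones):

```python
import numpy as np

# Toy vocabulary; real systems use tens or hundreds of thousands of words.
vocab = ["king", "queen", "man", "woman"]
word_to_index = {word: i for i, word in enumerate(vocab)}

# Embedding matrix: one row of real numbers per word.
# Rows are random here; training would move related words closer together.
embedding_dim = 8
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), embedding_dim))

def embed(word: str) -> np.ndarray:
    """Map a word to its real-valued vector."""
    return embeddings[word_to_index[word]]

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Standard similarity measure between two embedding vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(embed("king"))                                   # an 8-dimensional vector
print(cosine_similarity(embed("king"), embed("queen")))
```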

Techniques for learning word embeddings include Word2Vec, which trains a shallow neural network on a word-prediction task, and GloVe, which fits vectors to a global word co-occurrence matrix; embeddings can also be learned as a by-product of training a neural network on an NLP task such as language modeling or document classification.
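
As a usage sketch of the Word2Vec approach, the snippet below uses the gensim library (an assumption here, not named above; the keyword names follow gensim 4.x, e.g. vector_size). The three-sentence corpus and all hyperparameters are placeholders; in practice the corpus would be millions of tokenized sentences:

```python
from gensim.models import Word2Vec

# Toy tokenized corpus standing in for a large text collection.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "raw", "text"],
    ["glove", "builds", "vectors", "from", "cooccurrence", "counts"],
]

# sg=1 selects the skip-gram objective; sg=0 would select CBOW.
# vector_size is the dimensionality of the learned embeddings.
model = Word2Vec(
    sentences,
    vector_size=50,
    window=3,
    min_count=1,
    sg=1,
    epochs=20,
    seed=0,
)

vector = model.wv["embeddings"]                    # the learned 50-d vector
print(model.wv.most_similar("embeddings", topn=3)) # nearest words by cosine
```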

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2801–2810 of 4002 papers

Title | Status | Hype
Literal or idiomatic? Identifying the reading of single occurrences of German multiword expressions using word embeddings | | 0
LIT Team's System Description for Japanese-Chinese Machine Translation Task in IWSLT 2020 | | 0
LMU Bilingual Dictionary Induction System with Word Surface Similarity Scores for BUCC 2020 | | 0
LNMap: Departures from Isomorphic Assumption in Bilingual Lexicon Induction Through Non-Linear Mapping in Latent Space | | 0
Local-Global Vectors to Improve Unigram Terminology Extraction | | 0
Local Homology of Word Embeddings | | 0
Locality Preserving Sentence Encoding | | 0
Locally-Contextual Nonlinear CRFs for Sequence Labeling | | 0
Local Topology Measures of Contextual Language Model Latent Spaces With Applications to Dialogue Term Extraction | | 0
Looking for a Role for Word Embeddings in Eye-Tracking Features Prediction: Does Semantic Similarity Help? | | 0
Page 281 of 401

No leaderboard results yet.