
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
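
As a minimal sketch of this mapping, the snippet below stores hand-picked 3-dimensional vectors for a toy vocabulary and compares them with cosine similarity. The vectors and words here are illustrative assumptions, not trained embeddings; real embeddings are learned from data and typically have hundreds of dimensions.

```python
# Minimal sketch: each word maps to a real-valued vector, and geometric
# closeness stands in for semantic similarity. Vectors are made up.
import numpy as np

embeddings = {
    "king":  np.array([0.80, 0.65, 0.10]),
    "queen": np.array([0.75, 0.70, 0.12]),
    "apple": np.array([0.10, 0.20, 0.90]),
}

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine of the angle between two embedding vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Related words end up closer together than unrelated ones.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```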

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that are trained on an NLP task such as language modeling or document classification.
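
To make the training setup concrete, here is a hedged sketch using the gensim library's Word2Vec implementation (one of the techniques named above). The tiny corpus and the hyperparameter values are assumptions chosen purely for illustration; a real run needs a much larger corpus, and gensim >= 4.0 is assumed to be installed.

```python
# Sketch of learning word embeddings with gensim's Word2Vec.
# The corpus is a made-up toy example, far too small for useful vectors.
from gensim.models import Word2Vec

corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "cat", "sat", "on", "the", "mat"],
]

# sg=1 selects the skip-gram objective; sg=0 would use CBOW instead.
model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the learned vectors
    window=2,         # context window size
    min_count=1,      # keep every word in this toy vocabulary
    sg=1,
    epochs=100,
)

print(model.wv["king"])               # the learned 50-dim vector for "king"
print(model.wv.most_similar("king"))  # nearest neighbors by cosine similarity
```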

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3861–3870 of 4002 papers

Title | Status | Hype
Are you tough enough? Framework for Robustness Validation of Machine Comprehension Systems | Code | 0
Are We Consistently Biased? Multidimensional Analysis of Biases in Distributional Word Vectors | Code | 0
Query and Output: Generating Words by Querying Distributed Word Representations for Paraphrase Generation | Code | 0
Caveats of Measuring Semantic Change of Cognates and Borrowings using Multilingual Word Embeddings | Code | 0
Incorporating Syntactic and Semantic Information in Word Embeddings using Graph Convolutional Networks | Code | 0
Causally Denoise Word Embeddings Using Half-Sibling Regression | Code | 0
Categorical Metadata Representation for Customized Text Classification | Code | 0
MGAD: Multilingual Generation of Analogy Datasets | Code | 0
Query-by-Example Search with Discriminative Neural Acoustic Word Embeddings | Code | 0
Graph-of-Tweets: A Graph Merging Approach to Sub-event Identification | Code | 0
Page 387 of 401

Leaderboards

No leaderboard results yet.