
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
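
To make the mapping concrete, the minimal sketch below stores one real-valued vector per word in a lookup table and compares two words by cosine similarity. The three-word vocabulary, the 4-dimensional vectors, and the random initialization are all hypothetical; a trained embedding model would learn these values from a corpus.

```python
import numpy as np

# Hypothetical 3-word vocabulary mapped to row indices of an embedding matrix.
vocab = {"king": 0, "queen": 1, "apple": 2}

# Toy 4-dimensional embeddings; a real model would learn these from data.
rng = np.random.default_rng(0)
E = rng.normal(size=(len(vocab), 4))

def embed(word: str) -> np.ndarray:
    """Look up the vector of real numbers assigned to a word."""
    return E[vocab[word]]

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity, the usual way embedding vectors are compared."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(embed("king"), embed("queen")))
```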

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
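
As a usage sketch, the snippet below trains a Word2Vec skip-gram model with the gensim library (assuming gensim 4.x); the toy corpus and the hyperparameter values are purely illustrative, not a recommended configuration.

```python
from gensim.models import Word2Vec

# Tiny illustrative corpus: a list of tokenized sentences.
sentences = [
    ["the", "king", "rules", "the", "land"],
    ["the", "queen", "rules", "the", "land"],
    ["apples", "grow", "on", "trees"],
]

# Train a skip-gram model (sg=1); vector_size, window, and epochs are arbitrary here.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

vec = model.wv["king"]                         # a 50-dimensional real-valued vector
print(model.wv.most_similar("king", topn=2))   # nearest neighbors by cosine similarity
```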

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2031–2040 of 4002 papers (page 204 of 401)

Title | Status | Hype
Query Expansion with Locally-Trained Word Embeddings | | 0
Querying Word Embeddings for Similarity and Relatedness | | 0
Query Obfuscation by Semantic Decomposition | | 0
Query Translation for Cross-Language Information Retrieval using Multilingual Word Clusters | | 0
Question Answering over Freebase with Multi-Column Convolutional Neural Networks | | 0
Questioning Arbitrariness in Language: a Data-Driven Study of Conventional Iconicity | | 0
Question Type Classification Methods Comparison | | 0
Raccoons at SemEval-2022 Task 11: Leveraging Concatenated Word Embeddings for Named Entity Recognition | | 0
Racial Bias Trends in the Text of US Legal Opinions | | 0
Random Decision Syntax Trees at SemEval-2018 Task 3: LSTMs and Sentiment Scores for Irony Detection | | 0

Leaderboards

No leaderboard results yet.