
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
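To make the word-to-vector mapping concrete, here is a minimal sketch in Python with NumPy. The three-dimensional vectors are hand-picked toy values for illustration only; real embeddings are learned from data and typically have hundreds of dimensions.

```python
import numpy as np

# Toy illustration of the core idea: each vocabulary word is mapped to a
# dense vector of real numbers. These values are made up for illustration;
# real embeddings are learned from a corpus.
embeddings = {
    "king":  np.array([0.80, 0.10, 0.70]),
    "queen": np.array([0.75, 0.90, 0.65]),
    "apple": np.array([0.10, 0.20, 0.05]),
}

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Similarity of two word vectors; 1.0 means identical direction."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should end up with similar vectors.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```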

Common techniques for learning word embeddings include Word2Vec, GloVe, and other approaches that train on an auxiliary NLP task such as language modeling or document classification.
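As one example of such a technique, here is a minimal Word2Vec training sketch using the gensim library (the gensim 4.x API is assumed; the toy corpus is far too small to yield meaningful embeddings and is for illustration only).

```python
from gensim.models import Word2Vec

# Tiny toy corpus: a list of tokenized sentences.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["i", "ate", "an", "apple"],
]

# Train skip-gram Word2Vec embeddings (sg=1); vector_size is the
# dimensionality of the learned word vectors.
model = Word2Vec(
    sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=50
)

vector = model.wv["king"]                        # 50-dimensional embedding for "king"
similar = model.wv.most_similar("king", topn=3)  # nearest words by cosine similarity
print(vector.shape, similar)
```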

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3341–3350 of 4002 papers

Title | Status | Hype
Semantic Similarity of Arabic Sentences with Word Embeddings | - | 0
Using Linked Disambiguated Distributional Networks for Word Sense Disambiguation | - | 0
Integrating Semantic Knowledge into Lexical Embeddings Based on Information Content Measurement | Code | 0
Literal or idiomatic? Identifying the reading of single occurrences of German multiword expressions using word embeddings | - | 0
Cross-Lingual Syntactically Informed Distributed Word Representations | - | 0
Multivariate Gaussian Document Representation from Word Embeddings for Text Categorization | - | 0
Modelling metaphor with attribute-based semantics | - | 0
Reranking Translation Candidates Produced by Several Bilingual Word Similarity Sources | - | 0
Using Word Embedding for Cross-Language Plagiarism Detection | - | 0
Ranking Convolutional Recurrent Neural Networks for Purchase Stage Identification on Imbalanced Twitter Data | - | 0
