Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
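
To make the mapping concrete, the sketch below implements a bare embedding lookup table: each vocabulary word indexes one row of a real-valued matrix. The toy vocabulary, the 4-dimensional size, and the random initialization are illustrative assumptions, not part of any particular method.

```python
# Minimal word-embedding lookup: each word maps to one row of a
# real-valued matrix. Vocabulary, dimension, and random init are
# toy assumptions for illustration only.
import numpy as np

vocab = {"king": 0, "queen": 1, "man": 2, "woman": 3}
dim = 4                                   # embedding dimension (toy value)
rng = np.random.default_rng(seed=0)
embeddings = rng.normal(size=(len(vocab), dim))   # one row per word

def embed(word: str) -> np.ndarray:
    """Return the real-valued vector assigned to a known word."""
    return embeddings[vocab[word]]

print(embed("queen"))   # a 4-dimensional real vector
```

In a trained model these rows are learned rather than random, so that words appearing in similar contexts end up with nearby vectors.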

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train the vectors on an NLP task such as language modeling or document classification.
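
As one concrete route, here is a minimal sketch of training a skip-gram Word2Vec model with gensim (assuming gensim >= 4.0 is installed); the three-sentence toy corpus and all hyperparameter values are illustrative assumptions.

```python
# Minimal Word2Vec training sketch with gensim (assumes gensim >= 4.0).
# The toy corpus and hyperparameters below are illustrative only.
from gensim.models import Word2Vec

corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "local", "context"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "counts"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the learned vectors
    window=2,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=50,        # many epochs to compensate for the tiny corpus
)

vec = model.wv["embeddings"]                        # learned real-valued vector
print(model.wv.most_similar("embeddings", topn=3))  # nearest words by cosine
```

Skip-gram (sg=1) predicts context words from the center word and tends to work better for rare words; CBOW (sg=0) is faster to train on large corpora.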

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3261–3270 of 4002 papers

Title | Status | Hype
Deriving Disinformation Insights from Geolocalized Twitter Callouts | Code | 0
Deriving Word Vectors from Contextualized Language Models using Topic-Aware Mention Selection | Code | 0
Named Entity Recognition in the Romanian Legal Domain | Code | 0
Design and Implementation of a Quantum Kernel for Natural Language Processing | Code | 0
Intrinsic Probing through Dimension Selection | Code | 0
Introducing Orthogonal Constraint in Structural Probes | Code | 0
Towards Better UD Parsing: Deep Contextualized Word Embeddings, Ensemble, and Treebank Concatenation | Code | 0
word2vec Explained: deriving Mikolov et al.'s negative-sampling word-embedding method | Code | 0
Detecting Anxiety through Reddit | Code | 0
Named Entity Recognition with Bidirectional LSTM-CNNs | Code | 0
Page 327 of 401

No leaderboard results yet.