Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
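
The idea can be made concrete in a few lines: each word in the vocabulary indexes a row of a real-valued matrix, and relatedness between words is measured geometrically. The sketch below is a minimal illustration with a toy vocabulary and randomly initialized vectors; the names used here (`vocab`, `E`, `cosine`) are illustrative, and a trained model would supply meaningful values in place of the random ones.

```python
# Minimal sketch of a word-embedding lookup, assuming a toy vocabulary.
# Vectors are random placeholders; a trained model would supply real values.
import numpy as np

vocab = ["king", "queen", "apple", "orange"]
idx = {w: i for i, w in enumerate(vocab)}

rng = np.random.default_rng(42)
E = rng.normal(size=(len(vocab), 8))  # embedding matrix: one 8-dim vector per word

def embed(word):
    # a word "embedding" is just the row of E indexed by the word
    return E[idx[word]]

def cosine(u, v):
    # cosine similarity: the standard geometric measure of word relatedness
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(embed("king"), embed("queen")))
```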

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches that learn the vectors by training on an NLP task such as language modeling or document classification.
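
As a concrete illustration of this kind of training, the following is a minimal from-scratch sketch of Word2Vec's skip-gram objective with negative sampling, written in plain NumPy. The tiny corpus, hyperparameters, and uniform negative sampling are simplifications chosen for readability (the original algorithm samples negatives from a smoothed unigram distribution); this is a teaching sketch, not a faithful reimplementation of any particular library.

```python
# Skip-gram with negative sampling, simplified: for each (target, context)
# pair within a window, push their vectors together; push the target away
# from a few randomly sampled "negative" words. Hyperparameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)

corpus = [
    "the cat sat on the mat".split(),
    "the dog sat on the log".split(),
]
vocab = sorted({w for sent in corpus for w in sent})
idx = {w: i for i, w in enumerate(vocab)}
V, dim, window, n_neg, lr = len(vocab), 16, 2, 5, 0.05

W_in = rng.normal(scale=0.1, size=(V, dim))   # "input" (target) vectors
W_out = rng.normal(scale=0.1, size=(V, dim))  # "output" (context) vectors

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

for epoch in range(200):
    for sent in corpus:
        for pos, word in enumerate(sent):
            t = idx[word]
            # context words within the window around the target
            lo, hi = max(0, pos - window), min(len(sent), pos + window + 1)
            for cpos in range(lo, hi):
                if cpos == pos:
                    continue
                c = idx[sent[cpos]]
                # one positive pair plus n_neg random negative samples
                # (uniform here; Word2Vec uses a smoothed unigram distribution)
                samples = [(c, 1.0)] + [(int(rng.integers(V)), 0.0)
                                        for _ in range(n_neg)]
                for o, label in samples:
                    score = sigmoid(W_in[t] @ W_out[o])
                    grad = score - label          # d(loss)/d(score)
                    g_in = grad * W_out[o]
                    W_out[o] -= lr * grad * W_in[t]
                    W_in[t] -= lr * g_in

print(W_in[idx["cat"]][:4])  # learned embedding for "cat" (first 4 dims)
```

After a few hundred passes over even this toy corpus, words that occur in shared contexts (here, "cat" and "dog") tend to drift toward more similar vectors, which is the behavior the objective is designed to produce.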

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3291–3300 of 4002 papers

Title | Status | Hype
--- | --- | ---
Robust Cross-lingual Hypernymy Detection using Dependency Context | Code | 0
Neural Activation Semantic Models: Computational lexical semantic models of localized neural activations | Code | 0
Bilingual Learning of Multi-sense Embeddings with Discrete Autoencoders | Code | 0
Robust Gram Embeddings | Code | 0
Neural-based Noise Filtering from Word Embeddings | Code | 0
Robustness and Reliability of Gender Bias Assessment in Word Embeddings: The Role of Base Pairs | Code | 0
Diachronic Analysis of German Parliamentary Proceedings: Ideological Shifts through the Lens of Political Biases | Code | 0
Neural Cross-Lingual Named Entity Recognition with Minimal Resources | Code | 0
Robust to Noise Models in Natural Language Processing Tasks | Code | 0
Italian Event Detection Goes Deep Learning | Code | 0

No leaderboard results yet.