
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
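
Concretely, an embedding is a lookup table from vocabulary entries to real-valued vectors, and closeness between words is usually measured with cosine similarity in that vector space. The sketch below (Python with NumPy) is a minimal illustration of the mapping; the three-word vocabulary and the random vectors are toy placeholders standing in for learned embeddings.

    import numpy as np

    # Toy vocabulary mapped to row indices; random vectors stand in
    # for embeddings that would normally be learned from data.
    vocab = {"king": 0, "queen": 1, "apple": 2}
    rng = np.random.default_rng(0)
    embeddings = rng.normal(size=(len(vocab), 50))  # one 50-dim vector per word

    def vector(word: str) -> np.ndarray:
        """Look up the real-valued vector for a word."""
        return embeddings[vocab[word]]

    def cosine(u: np.ndarray, v: np.ndarray) -> float:
        """Cosine similarity, the usual closeness measure in embedding space."""
        return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

    print(cosine(vector("king"), vector("queen")))

With trained embeddings, related words (here "king" and "queen") would score noticeably higher than unrelated pairs; with these random placeholders the score is arbitrary.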

Techniques for learning word embeddings include predictive neural models such as Word2Vec, count-based methods such as GloVe, and other approaches that train on an NLP task such as language modeling or document classification.
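
As a rough illustration of the training side, the sketch below fits a skip-gram Word2Vec model with gensim (assuming gensim 4.x, where the embedding dimension is the vector_size parameter); the two-sentence corpus is a toy placeholder, so the resulting vectors carry no real signal.

    from gensim.models import Word2Vec

    # A tiny tokenized corpus; real training uses millions of sentences.
    sentences = [
        ["word", "embeddings", "map", "words", "to", "vectors"],
        ["word2vec", "learns", "embeddings", "from", "context"],
    ]

    # Skip-gram Word2Vec (sg=1); min_count=1 keeps every word of the toy corpus.
    model = Word2Vec(sentences, vector_size=50, window=2,
                     min_count=1, sg=1, epochs=50)

    vec = model.wv["embeddings"]          # the learned 50-dim vector
    print(model.wv.most_similar("word"))  # nearest neighbours in embedding space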

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3771-3780 of 4002 papers

En-Ar Bilingual Word Embeddings without Word Alignment: Factors Effects
Encoders Help You Disambiguate Word Senses in Neural Machine Translation
Encoding Prior Knowledge with Eigenword Embeddings
Encoding Sentiment Information into Word Vectors for Sentiment Analysis
End-to-End Entity Linking and Disambiguation leveraging Word and Knowledge Graph Embeddings
ENGLAWI: From Human- to Machine-Readable Wiktionary
English-Malay Cross-Lingual Embedding Alignment using Bilingual Lexicon Augmentation
English-Malay Word Embeddings Alignment for Cross-lingual Emotion Classification with Hierarchical Attention Network
English Resource Semantics
English WordNet Random Walk Pseudo-Corpora
