Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
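
Concretely, an embedding is just a lookup from each vocabulary word to a dense real-valued vector, and semantic relatedness becomes geometric closeness. The sketch below uses made-up 4-dimensional vectors purely for illustration; real embeddings are learned from corpora and typically have 50–300 dimensions.

```python
import numpy as np

# Toy embedding table: each vocabulary word maps to a dense real-valued vector.
# These words and 4-dimensional vectors are invented for illustration only.
embeddings = {
    "king":  np.array([0.8, 0.1, 0.9, 0.3]),
    "queen": np.array([0.7, 0.2, 0.9, 0.4]),
    "apple": np.array([0.1, 0.9, 0.2, 0.8]),
}

def cosine_similarity(u, v):
    """Similarity of two embedding vectors, in [-1, 1]."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words end up closer together in the vector space.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```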

Techniques for learning word embeddings include GloVe, which factorizes global word co-occurrence statistics, and neural network-based approaches such as Word2Vec that train on an NLP task such as language modeling or document classification.
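
As a concrete example, the sketch below trains a Word2Vec skip-gram model with the gensim library (assuming gensim >= 4.0 is installed via pip install gensim); the tiny corpus and the hyperparameter values are illustrative only.

```python
from gensim.models import Word2Vec

# A toy tokenized corpus; real training uses millions of sentences.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context", "windows"],
    ["similar", "words", "get", "similar", "vectors"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,  # dimensionality of the learned vectors
    window=2,        # context window size
    min_count=1,     # keep every word in this toy corpus
    sg=1,            # 1 = skip-gram, 0 = CBOW
    epochs=50,
)

vec = model.wv["embeddings"]           # the learned vector for a word
print(model.wv.most_similar("words"))  # nearest neighbours in vector space
```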

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2111–2120 of 4002 papers

Title | Status | Hype
Empowering machine learning models with contextual knowledge for enhancing the detection of eating disorders in social media posts | | 0
Empty Category Detection using Path Features and Distributed Case Frames | | 0
Enabling Cognitive Intelligence Queries in Relational Databases using Low-dimensional Word Embeddings | | 0
Enabling Open-World Specification Mining via Unsupervised Learning | | 0
En-Ar Bilingual Word Embeddings without Word Alignment: Factors Effects | | 0
Encoders Help You Disambiguate Word Senses in Neural Machine Translation | | 0
Encoding Prior Knowledge with Eigenword Embeddings | | 0
Encoding Sentiment Information into Word Vectors for Sentiment Analysis | | 0
End-to-End Entity Linking and Disambiguation leveraging Word and Knowledge Graph Embeddings | | 0
ENGLAWI: From Human- to Machine-Readable Wiktionary | | 0
Page 212 of 401

No leaderboard results yet.