
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
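
As a minimal illustration of the idea, the sketch below uses plain NumPy with a hypothetical three-word vocabulary and made-up 4-dimensional vectors (real embedding tables cover large vocabularies with 50-300+ dimensions). It shows the embedding table as a matrix whose rows are word vectors, with cosine similarity as the usual way to compare them:

```python
import numpy as np

# Hypothetical toy vocabulary mapping each word to a row index.
vocab = {"king": 0, "queen": 1, "apple": 2}

# Embedding matrix: one row of real numbers per word (made-up values).
E = np.array([
    [0.8, 0.1, 0.6, 0.2],   # king
    [0.7, 0.2, 0.6, 0.3],   # queen
    [0.1, 0.9, 0.1, 0.8],   # apple
])

def embed(word: str) -> np.ndarray:
    """Map a word to its vector by table lookup."""
    return E[vocab[word]]

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity: the standard closeness measure in embedding space."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(embed("king"), embed("queen")))  # high: related words
print(cosine(embed("king"), embed("apple")))  # lower: unrelated words
```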

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
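
For instance, a Word2Vec skip-gram model can be trained with the gensim library. This is a sketch assuming gensim 4.x; the toy corpus and hyperparameter values below are illustrative placeholders, not settings from any paper listed here:

```python
from gensim.models import Word2Vec

# Toy corpus: each sentence is a list of tokens. A real corpus would
# contain millions of sentences.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "counts"],
]

# sg=1 selects the skip-gram objective (predict context words from a
# target word); sg=0 would select CBOW instead.
model = Word2Vec(
    sentences,
    vector_size=50,  # dimensionality of the learned vectors
    window=3,        # context window size
    min_count=1,     # keep every token in this tiny corpus
    sg=1,
    epochs=20,
)

vec = model.wv["embeddings"]          # the learned 50-d vector for a word
print(model.wv.most_similar("word"))  # nearest neighbors in vector space
```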

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing papers 1571-1580 of 4002:

- Towards Entity Spaces
- Automatic Term Extraction from Newspaper Corpora: Making the Most of Specificity and Common Features
- Automatic Creation of Correspondence Table of Meaning Tags from Two Dictionaries in One Language Using Bilingual Word Embedding
- Lexicon-Enhancement of Embedding-based Approaches Towards the Detection of Abusive Language
- Czech Historical Named Entity Corpus v 1.0
- Translating Knowledge Representations with Monolingual Word Embeddings: the Case of a Thesaurus on Corporate Non-Financial Reporting
- Automatically Building a Multilingual Lexicon of False Friends With No Supervision
- Automated Discovery of Mathematical Definitions in Text
- Analyzing Word Embedding Through Structural Equation Modeling
- Augmenting Small Data to Classify Contextualized Dialogue Acts for Exploratory Visualization
