
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
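
As a minimal sketch of the idea (the words, vectors, and dimensionality below are made up for illustration; trained embeddings are learned from large corpora and typically have hundreds of dimensions), the core interface is a lookup table from words to real-valued vectors, with similarity measured by vector geometry:

```python
import numpy as np

# Hypothetical toy embedding table: each word maps to a 4-dimensional
# real-valued vector (real models learn 100-1000 dimensions from data).
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1, 0.9]),
    "queen": np.array([0.7, 0.4, 0.2, 0.9]),
    "apple": np.array([0.1, 0.9, 0.8, 0.0]),
}

def cosine_similarity(a, b):
    """Similarity of two word vectors: 1.0 = same direction, 0.0 = orthogonal."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Semantically related words should end up closer in the vector space.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # lower
```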

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that are trained on an NLP task such as language modeling or document classification.
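
For example, Word2Vec embeddings can be trained with the gensim library. A minimal sketch, assuming gensim 4.x (the `vector_size` parameter was named `size` in 3.x) and a made-up toy corpus:

```python
from gensim.models import Word2Vec

# Toy corpus of pre-tokenized sentences; real training uses millions of tokens.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "farmer", "grows", "apples"],
]

# sg=1 selects the skip-gram objective; sg=0 would use CBOW instead.
model = Word2Vec(
    sentences,
    vector_size=50,  # dimensionality of the learned vectors
    window=2,        # context window size
    min_count=1,     # keep every word in this tiny vocabulary
    sg=1,
)

vector = model.wv["king"]             # the learned 50-dim vector for "king"
print(model.wv.most_similar("king"))  # nearest neighbors in the vector space
```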

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2431-2440 of 4002 papers

Title | Status | Hype
Bringing Order to Neural Word Embeddings with Embeddings Augmented by Random Permutations (EARP) | | 0
SParse: Koç University Graph-Based Parsing System for the CoNLL 2018 Shared Task | | 0
A Morphology-Based Representation Model for LSTM-Based Dependency Parsing of Agglutinative Languages | Code | 0
Modelling Salient Features as Directions in Fine-Tuned Semantic Spaces | Code | 0
Resources to Examine the Quality of Word Embedding Models Trained on n-Gram Data | | 0
CEA LIST: Processing Low-Resource Languages for CoNLL 2018 | Code | 0
Learning Text Representations for 500K Classification Tasks on Named Entity Disambiguation | Code | 0
Turku Neural Parser Pipeline: An End-to-End System for the CoNLL 2018 Shared Task | | 0
Phrase-level Self-Attention Networks for Universal Sentence Encoding | | 0
Joint Learning for Targeted Sentiment Analysis | | 0
Page 244 of 401

No leaderboard results yet.