
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
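To make the definition concrete, here is a minimal sketch of the core idea as a lookup table from vocabulary words to dense real-valued vectors. The vocabulary, dimensionality, and random vectors below are illustrative placeholders; in practice the vectors are learned from a corpus.

```python
# Minimal sketch of a word-embedding lookup table: each vocabulary
# word maps to a dense vector of real numbers. The vectors here are
# random placeholders; in practice they are learned from data.
import numpy as np

vocab = ["king", "queen", "apple"]  # illustrative vocabulary
dim = 4                             # embedding dimensionality (real systems use 50-300+)
embeddings = {word: np.random.randn(dim) for word in vocab}

print(embeddings["king"])           # a 4-dimensional real-valued vector
```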

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches trained on an NLP task such as language modeling or document classification.
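As an illustration of the Word2Vec approach, here is a minimal training sketch using the gensim library (assuming its 4.x Word2Vec API). The toy corpus and hyperparameter values are placeholders, not a recommended configuration.

```python
# Minimal Word2Vec training sketch, assuming gensim 4.x is installed.
from gensim.models import Word2Vec

# Toy corpus: each document is a list of tokens. A real corpus
# (e.g. Wikipedia or web text) would be far larger.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
    ["glove", "factorizes", "a", "cooccurrence", "matrix"],
]

model = Word2Vec(
    sentences,
    vector_size=50,  # dimensionality of the embedding vectors
    window=2,        # context window size
    min_count=1,     # keep every token in this toy corpus
    sg=1,            # 1 = skip-gram; 0 = CBOW
)

vec = model.wv["embeddings"]          # 50-dimensional numpy vector
print(model.wv.most_similar("word"))  # nearest neighbors by cosine similarity
```

Skip-gram (sg=1) predicts context words from a center word, while CBOW (sg=0) predicts the center word from its context; GloVe instead fits vectors to global co-occurrence statistics.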

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2541–2550 of 4002 papers

Title | Status | Hype
JeSemE: Interleaving Semantics and Emotions in a Web Service for the Exploration of Language Change Phenomena | - | 0
A LSTM Approach with Sub-Word Embeddings for Mongolian Phrase Break Prediction | - | 0
AnlamVer: Semantic Model Evaluation Dataset for Turkish - Word Similarity and Relatedness | Code | 0
Learning to Generate Word Representations using Subword Information | - | 0
Modeling with Recurrent Neural Networks for Open Vocabulary Slots | - | 0
Toward Better Loanword Identification in Uyghur Using Cross-lingual Word Embeddings | - | 0
Improving Named Entity Recognition by Jointly Learning to Disambiguate Morphological Tags | Code | 0
What's in Your Embedding, And How It Predicts Task Performance | - | 0
Neural Activation Semantic Models: Computational lexical semantic models of localized neural activations | Code | 0
Towards a unified framework for bilingual terminology extraction of single-word and multi-word terms | Code | 0
Page 255 of 401

No leaderboard results yet.