
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
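
To make the idea of mapping words to real-valued vectors concrete, the sketch below compares hypothetical hand-written embeddings with cosine similarity. Real embeddings are learned from data and typically have 50 to 300 dimensions; the vectors and words here are illustrative assumptions only.

```python
import numpy as np

# Hypothetical 4-dimensional embeddings; real ones are learned, not hand-set.
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1, 0.9]),
    "queen": np.array([0.7, 0.4, 0.2, 0.9]),
    "apple": np.array([0.1, 0.9, 0.8, 0.0]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should end up with similar vectors.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high, ~0.99
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low, ~0.29
```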

Techniques for learning word embeddings include Word2Vec, GloVe, and neural network-based approaches that train on an NLP task such as language modeling or document classification.
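
As one way to experiment with such techniques, the following sketch trains a tiny Word2Vec (skip-gram) model with the gensim library, assuming gensim 4.x; the toy corpus, hyperparameters, and queried words are illustrative assumptions, not part of this page.

```python
from gensim.models import Word2Vec

# Toy corpus: each "sentence" is a list of tokens. A real corpus would be
# millions of sentences; this is only enough to exercise the API.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context", "windows"],
    ["glove", "learns", "embeddings", "from", "co-occurrence", "counts"],
]

# sg=1 selects the skip-gram objective; vector_size is the embedding dimension.
model = Word2Vec(
    sentences=sentences,
    vector_size=50,
    window=2,
    min_count=1,
    sg=1,
    epochs=100,
    seed=42,
)

vec = model.wv["embeddings"]  # a 50-dimensional numpy array
print(vec.shape)              # (50,)
print(model.wv.most_similar("embeddings", topn=3))  # nearest neighbours by cosine
```

Pretrained vectors (including GloVe releases such as "glove-wiki-gigaword-50") can also be loaded through gensim.downloader, which avoids training entirely.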

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3251–3260 of 4002 papers

Title | Status | Hype
--- | --- | ---
SuperTML: Two-Dimensional Word Embedding for the Precognition on Structured Tabular Data | Code | 0
Application of a Hybrid Bi-LSTM-CRF model to the task of Russian Named Entity Recognition | Code | 0
Interpreting Word Embeddings with Eigenvector Analysis | Code | 0
Revisiting Tri-training of Dependency Parsers | Code | 0
Dependency Sensitive Convolutional Neural Networks for Modeling Sentences and Documents | Code | 0
myNER: Contextualized Burmese Named Entity Recognition with Bidirectional LSTM and fastText Embeddings via Joint Training with POS Tagging | Code | 0
Supervised Acoustic Embeddings And Their Transferability Across Languages | Code | 0
NE-Table: A Neural key-value table for Named Entities | Code | 0
Named-Entity Recognition for Norwegian | Code | 0
Attentive Neural Network for Named Entity Recognition in Vietnamese | Code | 0

No leaderboard results yet.